It’s been two months since I started coding. Honestly, it would be more accurate to say it’s been two months of trying to understand JavaScript. HTML and CSS were easy, so I thought I’d be able to write my own error-free code within three months. But I was wrong. JavaScript drives me crazy.
JavaScript is complex and occasionally feels irrational. I learned that 0.1 + 0.2 ≠ 0.3 (you read that right). According to JavaScript, 0.1 + 0.2 === 0.30000000000000004, and the full number looks like this: 0.3000000000000000444089209850062616169452667236328125. This makes no sense at first, but it’s reality. It’s not a bug; it happens because of floating-point math. All numbers in JavaScript are stored in binary (a series of 0s and 1s), and values like 0.1 and 0.2 have no exact binary representation, so tiny rounding errors creep into the result.
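You can see this for yourself in any browser console or in Node. A small sketch, including one common workaround (comparing with a tiny tolerance instead of strict equality):

```javascript
// 0.1 and 0.2 can't be represented exactly in binary,
// so their sum is off by a tiny amount.
console.log(0.1 + 0.2);            // 0.30000000000000004
console.log(0.1 + 0.2 === 0.3);    // false

// Workaround: compare with a small tolerance instead of ===.
// nearlyEqual is a hypothetical helper name, not a built-in.
function nearlyEqual(a, b) {
  return Math.abs(a - b) < Number.EPSILON;
}
console.log(nearlyEqual(0.1 + 0.2, 0.3)); // true
```

`Number.EPSILON` is the smallest gap between 1 and the next representable number, which makes it a reasonable tolerance for sums of small values like these.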
A few more examples of weird JavaScript math:
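For instance, here are a few well-known surprises (all standard JavaScript behavior, easy to verify in a console):

```javascript
// NaN is the only value in JavaScript that is not equal to itself...
console.log(NaN === NaN);   // false
// ...and yet its type is "number".
console.log(typeof NaN);    // "number"

// Integers are only exact up to Number.MAX_SAFE_INTEGER (2^53 - 1).
// Beyond that, distinct integers can collapse into the same value.
console.log(9007199254740992 === 9007199254740993); // true

// Math.max with no arguments returns -Infinity.
console.log(Math.max());    // -Infinity
```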




Follow my blog for more interesting content, and please share which tricks you used to learn JavaScript.