From Java to JavaScript — Null

Marius Reimer
3 min read · Jan 16, 2019


When working with programming languages, each one may do things differently, but the majority of them have something in common: variables and types. A variable can have a specific type of value assigned to it. Depending on the language, the type system may differ, especially between Java and JavaScript (see my article about types). In JavaScript's dynamic type system, for instance, a variable may hold a string first and later be reassigned to an integer. When it comes to unassigned variables, languages handle that differently too.

Null and Default Types in Java

Java has a really simple and predictable way of handling null and default values. Every variable whose type extends java.lang.Object may become null and is null by default. All other variable types like int, boolean, float and char have specific values assigned to them by default if you don’t initialize them yourself. For example, booleans are always false by default:

https://docs.oracle.com/javase/tutorial/java/nutsandbolts/datatypes.html

These are the default values for primitive data types in Java, according to Oracle. For you as a programmer this means that you don’t strictly need to initialize these values yourself, but most of the time it’s a code convention, and clearer, to do so anyway. Note that default values only apply to fields; local variables must be initialized before they are read.
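To illustrate, here is a minimal Java sketch (the class and field names are made up for this example) that prints the defaults an uninitialized field receives:

// Minimal sketch of Java's default field values (hypothetical class for illustration).
public class DefaultValues {
    static int count;        // defaults to 0
    static float ratio;      // defaults to 0.0f
    static char letter;      // defaults to '\u0000'
    static boolean active;   // defaults to false
    static String name;      // reference types default to null

    public static void main(String[] args) {
        System.out.println(count);   // prints 0
        System.out.println(ratio);   // prints 0.0
        System.out.println(active);  // prints false
        System.out.println(name);    // prints null
        // Local variables are different: reading an uninitialized local
        // variable is a compile-time error, so they must be assigned first.
    }
}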

Null and Default Types in JavaScript

As I already explained in one of my last articles, JavaScript does not have a statically strict type system like Java. On the one side this can be error-prone; on the other side you can write pretty concise code and move fast with JavaScript.

In JavaScript, each variable is undefined by default, which means it has not been initialized yet. null, on the other hand, is a value you assign explicitly to signal that the variable is intentionally empty or that a value is currently not existing/available.

let str;
console.log(typeof str); // "undefined"

str = 'some string';
console.log(typeof str); // "string"

str = null;
console.log(typeof str); // "object"
console.log(str == null); // true
console.log(str == undefined); // true!
console.log(str === undefined); // false!

This example shows the JavaScript type system in action. At first, str is declared but not initialized, so it is undefined by default. The second assignment turns the variable into a string, and the third sets it to null (note that typeof null reports 'object', a well-known quirk of the language). The last three lines are the most interesting here: in JavaScript, null loosely equals (==) undefined, because the == operator treats the two as equal without checking types. If you also want the types to match, use strict equality (===), as in the last line.

To conclude:

  • === considers the type. So '42' !== 42
  • == does not consider the type. So '42' == 42

Originally published at mariusreimer.com on January 16, 2019.
