# Java Program to Calculate Entropy 2 Ways | Java Programs

Java program to calculate entropy. Entropy, in short, is the average level of information or uncertainty inherent in a variable's possible outcomes.

Our problem statement is to write a program that finds the entropy of a set of messages. The required inputs are the number of messages (n), which is a whole number, and the probability of each message (p), which can be fractional, so we use the double datatype for the probabilities. The expected output is the entropy value, which is also of type double.
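As a quick sanity check of the definition, a source with two equally likely messages (a fair coin, p = 0.5 each) should have an entropy of exactly 1 bit. A minimal sketch using the same formula the program applies below:

```java
public class EntropyCheck {
    public static void main(String[] args) {
        // Probabilities of two equally likely messages (a fair coin)
        double[] p = {0.5, 0.5};
        double entropy = 0;
        for (double prob : p) {
            // Each message contributes p * log2(1/p) bits of uncertainty
            entropy += prob * (Math.log(1 / prob) / Math.log(2));
        }
        System.out.println(entropy); // prints 1.0
    }
}
```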

Firstly, to gather our inputs at runtime, we can make use of the Scanner class in Java, which can read input at runtime for any primitive datatype. To use the predefined methods of this class, we first create an object of it:

Scanner input = new Scanner(System.in);

We will then read the number of messages at runtime using the Scanner object. Since the message count is a whole number, we read it with nextInt() rather than nextDouble():

int n = input.nextInt();

Since entropy is the average uncertainty over all possible outcomes, we read the probability of each message from 1 to n and, using the predefined log() method of the Math class, apply the formula p * log2(1/p), adding the uncertainty of each message to the entropy as follows:

for (int i = 1; i <= n; i++) {
    System.out.println("\nEnter the probability of message " + i + ":");
    p = input.nextDouble();
    // log2(x) = ln(x) / ln(2), since Math.log() is the natural logarithm
    entropy += p * (Math.log(1 / p) / Math.log(2));
}

As you can observe, the uncertainty of each message is calculated and added to the value of entropy, which was initialized to zero at the beginning of the code. The sum total at the end of the loop is our expected output, i.e., the entropy of the source, stored in the resultant variable (entropy). This is then printed on the console in a new line using the System.out.println() method:

System.out.println("\nThe entropy of the source is: " + entropy);
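Putting the pieces above together, one runnable version of the full program might look like this (the class name Entropy is a placeholder; the message count is read as an int, and the probabilities as doubles, as discussed above):

```java
import java.util.Scanner;

public class Entropy {
    public static void main(String[] args) {
        Scanner input = new Scanner(System.in);

        System.out.println("Enter the number of messages:");
        int n = input.nextInt(); // message count is a whole number

        double entropy = 0; // accumulates p * log2(1/p) for each message
        for (int i = 1; i <= n; i++) {
            System.out.println("\nEnter the probability of message " + i + ":");
            double p = input.nextDouble();
            entropy += p * (Math.log(1 / p) / Math.log(2));
        }

        System.out.println("\nThe entropy of the source is: " + entropy);
        input.close();
    }
}
```

For example, entering 4 messages each with probability 0.25 should print an entropy of 2.0 bits.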