# Entropy

Lognormal distribution differential entropy.

The differential entropy (in nats) for a lognormal random variable is

h(X) = μ + ½ ln( 2πeσ² )

where μ is the location parameter and σ > 0 is the scale parameter. By definition, the natural logarithm of a lognormally distributed random variable is normally distributed.

## Usage

```javascript
var entropy = require( '@stdlib/math/base/dists/lognormal/entropy' );
```


#### entropy( mu, sigma )

Returns the differential entropy (in nats) for a lognormal distribution with location parameter `mu` and scale parameter `sigma`.

```javascript
var y = entropy( 2.0, 1.0 );
// returns ~3.419

y = entropy( 0.0, 1.0 );
// returns ~1.419

y = entropy( -1.0, 2.0 );
// returns ~1.112
```


If provided `NaN` as any argument, the function returns `NaN`.

```javascript
var y = entropy( NaN, 1.0 );
// returns NaN

y = entropy( 0.0, NaN );
// returns NaN
```


If provided `sigma <= 0`, the function returns `NaN`.

```javascript
var y = entropy( 0.0, 0.0 );
// returns NaN

y = entropy( 0.0, -1.0 );
// returns NaN
```


## Examples

```javascript
var randu = require( '@stdlib/random/base/randu' );
var entropy = require( '@stdlib/math/base/dists/lognormal/entropy' );

var sigma;
var mu;
var y;
var i;

for ( i = 0; i < 10; i++ ) {
    mu = ( randu()*10.0 ) - 5.0;
    sigma = randu() * 20.0;
    y = entropy( mu, sigma );
    console.log( 'µ: %d, σ: %d, h(X;µ,σ): %d', mu.toFixed( 4 ), sigma.toFixed( 4 ), y.toFixed( 4 ) );
}
```