Histogram

Definition

A histogram is a representation of the distribution of a numeric variable in the given data. It consists of an X-Y plane: the X-axis is divided into ranges (bins), and on the Y-axis a bar shows how many values fall in each range.
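To make the binning concrete, here is a minimal sketch (assuming NumPy; the data values are hypothetical) of how a numeric variable is split into ranges and counted per range:

```python
import numpy as np

# Hypothetical numeric data to be grouped into bins.
values = np.array([1.2, 1.9, 2.1, 2.4, 3.3, 3.5, 3.6, 4.8])

# np.histogram returns the count per bin and the bin edges (the ranges).
counts, bin_edges = np.histogram(values, bins=4)

for count, left, right in zip(counts, bin_edges[:-1], bin_edges[1:]):
    print(f"range {left:.2f}-{right:.2f}: {count} value(s)")
```

Each printed range corresponds to one bar of the histogram, and the count is that bar's height.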

Why do you need it?

We need a histogram to turn irregular raw data into a tidy visualization that can be understood at a glance. The taller a bar, the greater the density of values falling in that range. Different colors can be assigned to the bars for different ranges to make the histogram eye-catching.

What kind of data can you visualize with it?

A histogram is best suited to numeric data that can be grouped into ranges. If your dataset records, for example, the heights of trees in centimeters and the count of trees lying in each height range, then a histogram is a natural choice. Other examples of such range-friendly data are temperatures, heights, and similar continuous measurements. A histogram whose Y-axis shows raw counts (frequencies) is known as a frequency histogram.
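The following sketch (assuming NumPy and Matplotlib) plots a frequency histogram for the tree-height example above; the heights are randomly generated stand-ins, not real measurements:

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical sample: 200 tree heights in cm, drawn from a normal distribution.
rng = np.random.default_rng(seed=0)
heights_cm = rng.normal(loc=500, scale=80, size=200)

# Bin the heights into 10 ranges; bar height = number of trees per range.
plt.hist(heights_cm, bins=10, edgecolor="black")
plt.xlabel("Tree height (cm)")
plt.ylabel("Number of trees (frequency)")
plt.title("Frequency histogram of tree heights")
plt.show()
```

Adjusting the `bins` argument trades detail for smoothness: more bins reveal finer structure, fewer bins give a coarser summary.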

Category

  • Distribution