Write a program to display a histogram representation of an integer array. In the simplest case, we could just print a number of *'s equal to the value of each entry in the array. For instance, if we had the array


int a[] = {7, 3, 2, 5};


then we could print 7 stars, 3 stars, 2 stars and 5 stars:


*******
***
**
*****
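
In C, this naive version is just a pair of nested loops. A minimal sketch, assuming a helper function (the name print_histogram is my own, not part of the assignment):

#include <stdio.h>

/* Print one row per array entry, one star per unit of the value. */
void print_histogram(const int a[], int n)
{
    for (int i = 0; i < n; i++) {
        for (int j = 0; j < a[i]; j++) {
            putchar('*');
        }
        putchar('\n');
    }
}

int main(void)
{
    int a[] = {7, 3, 2, 5};
    print_histogram(a, 4);
    return 0;
}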


But that approach doesn't work very well if the numbers get very large, and it isn't very attractive when the numbers are very small either. What we really want is for each line of stars to be proportional to the value in the array, so that the array {7000, 3000, 2000, 5000} looks the same as {7, 3, 2, 5}. To do this, scale the number of stars up or down. First decide how long your longest line of stars should be, say 60. Using the original example, the largest number (7) would then be displayed with 60 stars; that defines the scale factor, 60 / 7 in this case. Every number in the array should be multiplied by the same factor so that the line lengths stay proportional. The scale factor should be a float so that large numbers scale down correctly: if the largest value is 7000, the scale factor is 60 / 7000, which would be 0 in integer arithmetic.
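
Here is one possible sketch of the scaled version, assuming a longest line of 60 stars (the names MAX_STARS and print_scaled_histogram are my own choices, not requirements of the assignment):

#include <stdio.h>

#define MAX_STARS 60    /* chosen length of the longest line of stars */

/* Print each value as a row of stars, scaled so the largest value gets MAX_STARS stars. */
void print_scaled_histogram(const int a[], int n)
{
    /* Find the largest value in the array. */
    int max = a[0];
    for (int i = 1; i < n; i++) {
        if (a[i] > max) {
            max = a[i];
        }
    }
    if (max <= 0) {
        return;     /* nothing sensible to draw */
    }

    /* Float scale factor: 60 / 7000 would be 0 in integer arithmetic. */
    float scale = (float)MAX_STARS / max;

    for (int i = 0; i < n; i++) {
        int stars = (int)(a[i] * scale);    /* truncates; add 0.5f before the cast to round instead */
        for (int j = 0; j < stars; j++) {
            putchar('*');
        }
        putchar('\n');
    }
}

int main(void)
{
    int a[] = {7000, 3000, 2000, 5000};
    print_scaled_histogram(a, 4);   /* should look the same as {7, 3, 2, 5} */
    return 0;
}

With this scaling, {7000, 3000, 2000, 5000} and {7, 3, 2, 5} produce essentially the same picture, since only the ratios between the values matter.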


To test your program, use data from the dice rolling exercise in lab 11. We hope to see a classic bell-curve shape.
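
The lab 11 exercise itself isn't reproduced here, so the test sketch below assumes it means summing two six-sided dice; the number of rolls and the row labels are my own additions. It counts how often each sum from 2 to 12 comes up and then draws the counts with the same scaling idea as above:

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define ROLLS 100000    /* number of simulated rolls; adjust to taste */
#define MAX_STARS 60

int main(void)
{
    int counts[11] = {0};   /* frequencies of the sums 2..12 of two dice */

    srand((unsigned)time(NULL));
    for (int r = 0; r < ROLLS; r++) {
        int sum = (rand() % 6 + 1) + (rand() % 6 + 1);
        counts[sum - 2]++;
    }

    /* Scale so the most frequent sum gets MAX_STARS stars. */
    int max = counts[0];
    for (int i = 1; i < 11; i++) {
        if (counts[i] > max) {
            max = counts[i];
        }
    }
    float scale = (float)MAX_STARS / max;

    for (int i = 0; i < 11; i++) {
        printf("%2d: ", i + 2);     /* label each row with its dice sum */
        int stars = (int)(counts[i] * scale);
        for (int j = 0; j < stars; j++) {
            putchar('*');
        }
        putchar('\n');
    }
    return 0;
}

The row for 7 should come out longest, with the rows tapering off toward 2 and 12, which is the bell shape the exercise asks for.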
