# Interesting Math Problem

Iain King iainking at gmail.com
Thu Jun 5 10:48:10 CEST 2008

```
On Jun 4, 9:03 am, "BEES INC" <bees.... at gmail.com> wrote:
> I've been awfully busy programming lately. My Django-based side
> project is coming along well, and I hope to have it ready for use
> soon. Anyways, I came across an interesting little math problem
> today and was hoping some skilled programmers out there could come up
> with a more elegant solution than mine.
> Problem: Star Ratings
>
> People can rate cheeseburgers on my website with a star rating of 0-5
> stars (whole stars only), 5 being mighty tasty and 0 being disgusting.
> I would like to show the average of everyone's ratings of a particular
> cheeseburger to the nearest half star. I have already calculated the
> average rating as a float (star_sum) and the total number of people
> that rated the particular cheeseburger (num_raters). The result should
> be stored as a float in a variable named "stars."
> My Solution (in Python):
>
> # round to one decimal place and
> # separate into whole and fractional parts
> parts = str(round(star_sum/num_raters, 1)).split('.')
> whole = int(parts[0])
> frac = int(parts[1])
> if frac < 3:
>     frac = 0
> elif frac > 7:
>     frac = 0
>     whole += 1
> else:
>     frac = 5
> # recombine for a star rating rounded to the half
> stars = float(str(whole) + '.' + str(frac))
>
> Mmmm… In-N-Out Burgers… Please reply if you've got a better solution.

It'd be easier just to do the whole thing with ints.  Represent your
stars by counting half-stars (i.e. 0 = no stars, 1 = half a star, 2 =
1 star, etc.).  Then you just need to divide by 2 at the end.

stars = round(star_sum/num_raters, 0) / 2.0

Iain

```
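For later readers: the reply's half-star idea can be sketched directly against the original post's variables (`star_sum` as the sum of whole-star ratings, `num_raters` as the count of raters). The helper name below is hypothetical, and `math.floor(x + 0.5)` is used rather than `round()` so that exact quarter-star averages always round up instead of following Python 3's round-half-to-even behavior:

```python
import math

def half_star_rating(star_sum, num_raters):
    """Round an average whole-star rating to the nearest half star.

    Assumes star_sum is the sum of whole-star (0-5) ratings, as in the
    original post. Doubling the average counts in half-star units,
    rounding to the nearest integer picks the closest half star, and
    halving converts back to stars -- the reply's divide-by-2 trick.
    """
    average = star_sum / num_raters
    # floor(x + 0.5) rounds halves up; round() would round them to even
    return math.floor(average * 2 + 0.5) / 2

print(half_star_rating(13, 3))  # average 4.333... -> 4.5
print(half_star_rating(11, 5))  # average 2.2      -> 2.0
```

This avoids the string round-trip in the original solution entirely: the whole computation stays numeric.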