macman76
Staff Metrics

I analyzed data from a few staff members to see whether their review ratings coincided with the average ratings of Sputnik users. My analysis says nothing about causality. Are some staff members more influential than others, more similar to the Sputnik community, more contrarian, better looking? Those aren’t things this answers. Sorry.

For each staff member I list: the mean number of ratings (with a 68% confidence interval of the mean) on the albums they reviewed; their bias relative to the average user rating (positive means they rated higher than the users) and the Bayes factor of that comparison (values above one are evidence that the staff member’s ratings are significantly biased, values below one are evidence against the bias hypothesis); the polynomial statistical model that best fit the data (as assessed by the Bayes information criterion, with 8 as the highest order tested); and what the model predicts as the average score of an album that has a 1.0, 3.0, and 5.0 review from that staff member (essentially, what a staff member’s 1.0, 3.0, and 5.0 review generally means for the average rating).

Values in each model were weighted by the log10 of the number of ratings each reviewed album had. I chose to do this to reduce the influence of albums that had few ratings (which would, by small n, have noisier estimates) while still not granting a ton of weight to the most-rated albums.

The album listed for each staff member is their most outlying review, based on the projection distance (calculated via marginal medians) of review scores to the average rating score. This isn’t the biggest difference between the staff member’s review and the average Sputnik score (in fact, Omaha’s review score for Cypress Hill is a 2.5 against an actual score of 2.2, and he/she had other ratings with more than a 0.8-rating-unit difference), but rather the review that deviated most from their joint trend.
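The fitting procedure described above (log10-weighted polynomial fits, order chosen by BIC) can be sketched roughly as follows. The function name and data layout are my own, not from the post; note that numpy's `polyfit` multiplies residuals by the weights *before* squaring, so the square root of the desired weight is passed:

```python
import numpy as np

def fit_best_polynomial(review, avg_user, n_ratings, max_order=8):
    """Fit polynomials of order 1..max_order, weighted by log10 of each
    album's rating count, and keep the order with the lowest BIC."""
    review = np.asarray(review, dtype=float)
    avg_user = np.asarray(avg_user, dtype=float)
    w = np.log10(np.asarray(n_ratings, dtype=float))
    n = len(review)
    best_bic, best_coefs = np.inf, None
    for order in range(1, max_order + 1):
        # np.polyfit minimizes sum((w_fit * residual)^2), so pass sqrt(w)
        # to weight each album proportionally to log10(n_ratings).
        coefs = np.polyfit(review, avg_user, deg=order, w=np.sqrt(w))
        resid = avg_user - np.polyval(coefs, review)
        rss = np.sum(w * resid ** 2) / np.mean(w)  # weighted residual sum of squares
        k = order + 1                              # number of fitted coefficients
        bic = n * np.log(rss / n) + k * np.log(n)  # BIC under Gaussian errors
        if bic < best_bic:
            best_bic, best_coefs = bic, coefs
    return best_coefs
```

The BIC penalty (k * ln n per parameter) is what keeps a straight line winning unless the curvature genuinely earns its extra coefficients.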
1. Justin Bieber - My World 2.0


Klap
Rating #s:
170.0 +- 16.1 Ratings
Bias:
+0.027 rating bias, Bayes Factor = 0.114 (strong evidence that there is no bias)
Model:
-0.5 + 2.6x - 0.6x^2 + 0.1x^3, R^2 = 0.63 (rank 2)
1.0 Review = 1.6 Average Sputnik Rating
3.0 Review = 3.3 Average Sputnik Rating
5.0 Review = 4.0 Average Sputnik Rating
Notes:
A third-order polynomial best fit Klap’s ratings. That may be partially related to the fact that Klap has a lot of unique rating values (he has reviews at almost every number staff are allowed to rate with, like 2.3 and 4.1). More distinct points make it more likely that a curve, rather than a straight line, best fits the data (like a circle being more apparent when you’ve drawn it with 100 points rather than 3). At the same time, no album reaches an average rating of 1.0 or 5.0 even though individual reviewers will give those scores. Models of other reviewers show this compression too; for Klap, however, the tails of his/her model rise faster than the center.
2. The Crash Motive - Consequence


SowingSeason

Rating #s:
394.6 +- 45 Ratings
Bias:
+0.178 rating bias, Bayes Factor = 2.7 (weak evidence that there is bias)
Model:
1.8 + 0.5x, R^2 = 0.57 (rank 3)
1.0 Review = 2.3 Average Sputnik Rating
3.0 Review = 3.2 Average Sputnik Rating
5.0 Review = 4.1 Average Sputnik Rating
Notes:
By sheer luck (or as a sign of a larger trend), the next three reviewers have almost the same model (note that I rounded the model values). Sowing is not a fan of the non-.5-or-.0 decimals that would make for more continuous and reliable models. We could all benefit from a little nuance.
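Reading the 1.0/3.0/5.0 predictions off a model is just polynomial evaluation. Using the shared rounded linear model (the exact figures in the post presumably come from the unrounded coefficients):

```python
import numpy as np

# np.polyval takes coefficients highest order first: 0.5x + 1.8.
model = [0.5, 1.8]
predictions = np.polyval(model, [1.0, 3.0, 5.0])
print(predictions)  # approximately [2.3, 3.3, 4.3], close to the listed values
```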
3. The Summer Set - Legendary


Atari

Rating #s:
336 +- 50 Ratings
Bias:
+0.27 rating bias, Bayes Factor = 125 (very strong evidence that there is bias)
Model:
1.8 + 0.5x, R^2 = 0.65 (rank 1)
1.0 Review = 2.28 Average Sputnik Rating
3.0 Review = 3.25 Average Sputnik Rating
5.0 Review = 4.2 Average Sputnik Rating
Notes:
Our first probably-biased reviewer. One discrepancy I have to admit, however: though I weighted the models to add importance to the more often rated albums, I was not able to apply those weights when calculating the biases and their corresponding Bayes Factors. So, all Bayes Factors assume equal weights, which should make you a little more skeptical of their interpretation. Also, of these 10 staff members, Atari had the highest R^2, indicating that his/her model fits the trend really well.
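To illustrate that discrepancy, the bias below is computed both unweighted (as the Bayes Factors assume) and with the log10 weighting the models used. The album data here are made up purely for demonstration:

```python
import numpy as np

# Hypothetical albums: (staff review, average user rating, number of ratings).
review   = np.array([2.0, 3.0, 3.5, 4.0, 4.5])
avg_user = np.array([2.4, 3.1, 3.3, 3.8, 4.1])
counts   = np.array([40, 900, 120, 2500, 60])

diff = review - avg_user                       # positive = staff rated higher
unweighted_bias = diff.mean()                  # what the Bayes Factors were computed on
weighted_bias = np.average(diff, weights=np.log10(counts))
print(unweighted_bias, weighted_bias)          # the two can disagree noticeably
```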
4. Jessica Simpson - Do You Know


Willie

Rating #s:
136.7 +- 14.9 Ratings
Bias:
+0.020 rating bias, Bayes Factor = 0.10 (strong evidence that there is no bias)
Model:
1.8 + 0.5x, R^2 = 0.48 (rank 6)
1.0 Review = 2.28 Average Sputnik Rating
3.0 Review = 3.25 Average Sputnik Rating
5.0 Review = 4.2 Average Sputnik Rating
5. Cypress Hill x Rusko - Cypress Hill x Rusko


Omaha

Rating #s:
358.9 +- 45 Ratings
Bias:
+0.09 rating bias, Bayes Factor = 0.40 (weak evidence that there is no bias)
Model:
2.6 + 0.3x, R^2 = 0.297 (rank 10)
1.0 Review = 2.9 Average Sputnik Rating
3.0 Review = 3.48 Average Sputnik Rating
5.0 Review = 4.05 Average Sputnik Rating
Notes:
Omaha has our most compressed range (a 1.0 to 5.0 review for him/her translates to a 2.9 to 4.05 average). His/her model is also the least related to the average user rating (as assessed by R^2). A positive spin on this would be that his/her ratings are unique and unaffected by the masses (or: nobody likes his/her opinion and his/her review value will be unrelated to your enjoyment of said album).
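How exactly R^2 was scored isn't stated; a weighted version consistent with the log10 weighting described earlier might look like this (my assumption, not necessarily the post's calculation):

```python
import numpy as np

def weighted_r2(y, y_pred, w):
    """Fraction of (weighted) variance in the average user ratings
    explained by the model: 1 - weighted RSS / weighted TSS."""
    y, y_pred, w = map(np.asarray, (y, y_pred, w))
    ybar = np.average(y, weights=w)
    rss = np.sum(w * (y - y_pred) ** 2)   # weighted residual sum of squares
    tss = np.sum(w * (y - ybar) ** 2)     # weighted total sum of squares
    return 1.0 - rss / tss
```

A perfect fit gives 1.0 and predicting the weighted mean everywhere gives 0.0, so Omaha's 0.297 means his/her model leaves most of the user-rating variance unexplained.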
6. Jimmy Eat World - Stay on My Side Tonight


DaveyBoy

Rating #s:
279 +- 27.3 Ratings
Bias:
+0.021 rating bias, Bayes Factor = 0.098 (strong evidence that there is no bias)
Model:
0.9 + 1.0x - 0.1x^2, R^2 = 0.538 (rank 4)
1.0 Review = 1.84 Average Sputnik Rating
3.0 Review = 3.19 Average Sputnik Rating
5.0 Review = 3.9 Average Sputnik Rating
7. Chroma Key - Dead Air For Radios


Voivod

Rating #s:
61.2 +- 8.9 Ratings
Bias:
+0.22 rating bias, Bayes Factor = 131.9 (strong evidence that there is bias)
Model:
2.1 + 0.4x, R^2 = 0.306 (rank 9)
1.0 Review = 2.49 Average Sputnik Rating
3.0 Review = 3.32 Average Sputnik Rating
5.0 Review = 4.14 Average Sputnik Rating
Notes:
Voivod reviews some real underground shit, as evidenced by the fact that the average number of ratings on said albums is unable to collect social security. Given the small standard error of the number of ratings, a weighting that favored the more-rated albums would probably not change his/her bias value much, so the Bayes Factor may be accurate: he/she may actually be positively biased.
8. Fran Healy - Wreckorder


Irving

Rating #s:
142 +- 30.2 Ratings
Bias:
-0.036 rating bias, Bayes Factor = 0.151 (strong evidence that there is no bias)
Model:
-1.7 + 4.4x - 1.3x^2 + 0.1x^3, R^2 = 0.443 (rank 8)
1.0 Review = 1.5 Average Sputnik Rating
3.0 Review = 3.28 Average Sputnik Rating
5.0 Review = 4.27 Average Sputnik Rating
Notes:
The only negative bias (though probably not significantly so), and our widest range for the user rating. Irving’s model fit also looks a lot like Klap’s (thank you, unique review values).
9. Megadeth - Super Collider


AtomicWaste

Rating #s:
239 +- 36 Ratings
Bias:
+0.165 rating bias, Bayes Factor = 0.916 (very little evidence in favor of the no-bias hypothesis)
Model:
2.26 + 0.36x, R^2 = 0.531 (rank 5)
1.0 Review = 2.5 Average Sputnik Rating
3.0 Review = 3.35 Average Sputnik Rating
5.0 Review = 4.08 Average Sputnik Rating
10. Skrillex - Recess


Brostep

Rating #s:
134 +- 50.9 Ratings
Bias:
+0.111 rating bias, Bayes Factor = 0.27 (weak evidence that there is no bias)
Model:
1.9 + 0.4x, R^2 = 0.452 (rank 7)
1.0 Review = 2.3 Average Sputnik Rating
3.0 Review = 3.1 Average Sputnik Rating
5.0 Review = 3.9 Average Sputnik Rating
Site Copyright 2005-2023 Sputnikmusic.com