Doug Soles, on , said:
I think the problem with using a Watchout-style ranking method in one particular state vs. the entire country is that the teams race head to head often and people are well aware of the results. It is very hard to know who is better in CA vs. NY without racing head to head, so something like Watchout and a little math help give insight on that.
It is very clear from the most recent results which teams are better than others in CA, especially which ones will perform better on an important course like Mt. SAC. After a big meet like that, with all the data provided by you on this site, it is easier for people to say which teams are ahead of the others, especially as we enter the end of the season. I have seen this with our own team in the rankings: our boys' depth and inconsistency from our top guys has still somehow kept us up in the rankings ahead of teams that have clearly outperformed us recently.
My suggestion to you would be: if you like the season-long math strategy, that is great. But leave yourself some wiggle room to adjust the order based on the obviousness of results (a team that has consistently beaten another but has a little less math because of a meet or two difference in attendance) if you want the vast majority of the people looking at your rankings to give them any credence.
Just my opinion, keep up the hard work. It is appreciated.
Doug
Thanks, Doug, I appreciate that - and that's exactly what I've been trying to do!
From summer to the first rankings after Woodbridge, I stuck with the straight math approach - and got a lot of feedback that I seemed to disregard specific match-ups at Woodbridge. Since then I have been doing a blend of the calculated best and the most recent head-to-head results. So a team may calculate higher than their recent performance (e.g. the GO boys), and I slot them somewhere in the middle using some judgement. For the GO boys, I dropped them from 3rd to 6th: even though their performance at Mt. SAC was lower than that, their earlier performances supported that ranking.
If people want to just see the most recent performances in order, I already published the merged scores. Rankings can't rely solely on that last performance. Do I get it right every time? Absolutely not. But that's the plan!
harrier4, on , said:
Yes, Mira Costa had a better showing at Mt. SAC - in fact, I show all 5 scoring girls setting their best season performances this weekend. However, Redondo had a couple of scorers who performed better at earlier meets, giving them the edge in overall season calculated scoring.
Honestly I show the Mira Costa and Redondo girls as neck and neck - it came down to a judgement call that could have gone either way. The Bay League is stepping up and producing some of the best teams in California once again!
missy59, on , said:
Understand that the rankings are not just the most recent performances in order. They take into account the best 5 individual performances throughout the season, with more recent performances given more weight.
I would need to know which team you are talking about to provide a more specific answer.
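To make that "best 5, recency-weighted" idea concrete, here is a minimal Python sketch of one way such a team score could work. The discount rate, the scoring scale (raw times in seconds), and the sample data are all illustrative assumptions on my part, not the actual formula behind these rankings.

```python
from datetime import date

# Hypothetical sketch: each runner's season is a list of (meet_date, time_seconds).
# A runner's adjusted season best is their fastest time after a small recency
# penalty that discounts older marks; the team score sums the five best runners.

RECENCY_DISCOUNT = 0.001  # illustrative: 0.1% penalty per day of age

def weighted_best(marks, today):
    """Best adjusted time for one runner: older marks carry a small penalty."""
    return min(t * (1 + RECENCY_DISCOUNT * (today - d).days) for d, t in marks)

def team_score(roster, today):
    """Sum of the five best weighted runner times (lower is better)."""
    bests = sorted(weighted_best(marks, today) for marks in roster)
    return sum(bests[:5])

today = date(2013, 10, 28)
roster = [
    [(date(2013, 9, 14), 940.0), (date(2013, 10, 26), 928.0)],  # runner 1
    [(date(2013, 9, 14), 955.0)],                                # runner 2
    [(date(2013, 10, 26), 960.0)],                               # runner 3
    [(date(2013, 9, 28), 970.0), (date(2013, 10, 26), 975.0)],  # runner 4
    [(date(2013, 10, 26), 982.0)],                               # runner 5
    [(date(2013, 10, 26), 999.0)],  # sixth runner, dropped from the top 5
]
print(round(team_score(roster, today), 1))
```

Note how runner 1's recent 928 beats their older 940 even before adjustment, while runner 4's older 970 loses to their newer 975 once the recency penalty is applied - which is the behavior described above, where a recent mark can outrank a slightly faster old one.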
On the girls side, the front end went much as expected. Saugus had a great performance after staying under the radar recently, and jumped from 16th to 7th. St. Francis of Sacramento proved themselves a top 10 team, jumping from 14th to 9th. Serrano showed up in a big way, running a top 10 team time and jumping from 22nd to 13th as a result. Mira Costa also had a number of season-best performances, coming back into the rankings at 19th.
A few highlights for the boys: De La Salle had a stellar performance in their first high-profile meet since Stanford, jumping back in the rankings all the way to 7th. Burbank was the other team with a huge move, going from unranked to 8th following a great 5th place in the merge. D3 up-and-comer Rubidoux vaulted up to 12th following a very solid team performance. Finally, Warren also had a great run, and jumped from 13th to 4th.
Certainly I could highlight more, but those were the big upward movers. These rankings are a blend of teams' best performances (date adjusted) and their most recent team showing.
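For the curious, that blend of season-long calculation and most recent showing can be sketched in a few lines of Python. The 60/40 split below is purely an illustrative assumption, not the actual weighting used for these rankings.

```python
# Illustrative blend: the final ranking score mixes the season-long calculated
# score with the team's most recent merged-meet score. Lower is better,
# matching cross country scoring conventions.

BLEND = 0.6  # assumed weight on the season-long calculated score

def blended_score(season_calc, most_recent):
    """Weighted average of the two components (lower is better)."""
    return BLEND * season_calc + (1 - BLEND) * most_recent

# e.g. a team whose season math says 4850 but who just ran a 4700-point meet
print(blended_score(4850.0, 4700.0))
```

A team that outran its season-long math at the latest meet is pulled up toward that recent result, and vice versa - which matches the "wiggle room" approach described in the replies above.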
I welcome comments and questions as always.