The most important part of a blaster review is the data. Thanks to the efforts of the testing arm of TeamNC (contact Testing@NerfCenter.com with comments), the web-design arm of TeamNC got to sit around basking in the warm glow of computer monitors. The testing staff's job was to lock themselves in their basements, testing blasters till all hours of the morning.
How the blasters were tested:
The blasters went through three tests, listed in no particular order: an accuracy test, a maximum distance test, and a rate of fire test.
The accuracy test consisted of taking each blaster and firing it 80 times at a 1 by 2 foot target. The blaster was fired from four distances (10, 15, 20, & 25 ft., with 20 shots at each distance) and the hits were recorded. We then divided hits by shots fired to obtain a hit percentage for each distance. The test was then repeated, but with a 2 by 4 foot target.
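For the curious, the hit-percentage math is just hits divided by shots at each distance. A quick sketch in Python (the hit counts below are made up for illustration, not real test data):

```python
# Hypothetical hit counts out of 20 shots at each distance (illustrative only).
hits_by_distance = {10: 18, 15: 15, 20: 12, 25: 9}
SHOTS_PER_DISTANCE = 20

for distance_ft, hit_count in hits_by_distance.items():
    # Hit percentage = hits / shots fired, expressed as a percent.
    percentage = hit_count / SHOTS_PER_DISTANCE * 100
    print(f"{distance_ft} ft: {percentage:.0f}%")
```

The same calculation is simply repeated for the 2 by 4 foot target's numbers.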
The maximum distance test consisted of firing the blaster 10-20 times, measuring each shot, and then marking down the average maximum distance.
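That average is a plain mean over the measured shots. Sketched in Python with made-up measurements (real ranges vary by blaster):

```python
# Hypothetical max-distance measurements in feet for one blaster (illustrative only).
distances_ft = [42, 45, 44, 41, 43, 46, 44, 42, 45, 43]

# Average max distance = sum of measurements / number of shots.
average_ft = sum(distances_ft) / len(distances_ft)
print(f"Average max distance: {average_ft:.1f} ft")
```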
The rate of fire test was done by taking the blaster and shooting it 5 times while another person timed the test with a stopwatch. The total time was then divided by the number of shots to find the average time per piece of ammo fired.
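In other words, stopwatch time over shot count. A minimal Python sketch (the 7.5-second reading is a made-up example, not a measured result):

```python
# Hypothetical stopwatch reading for a 5-shot string (illustrative only).
shots_fired = 5
elapsed_seconds = 7.5

# Average time per shot = total elapsed time / shots fired.
seconds_per_shot = elapsed_seconds / shots_fired
print(f"{seconds_per_shot:.1f} seconds per shot")
```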
Although we originally planned to also run a jam ratio test, TeamNC concluded that jamming occurs randomly. It is mostly caused by defects and flukes in blaster design, and thus cannot be standardized easily.
The results were then shot over to the NC Webmasters and entered into the proper Nerf®Center Review page. A picture of the blaster was taken, and other relevant facts about the blaster (NCR, TechRating, date of release, availability, average price, series, dimensions, and weight) were also recorded. The review was then posted online.
And, voilà! Another blaster was successfully reviewed.