We are thinking about possible changes to our reviewing procedure and would like others' views on it. Please also post a reply if you have anything to add or want to discuss.
Please tick any option that you feel is right/good/not needed/needed etc.
We are considering cutting down on the size and number of tests so that we can provide more reviews, and faster reviews, of new drives - keep that in mind.
We are also thinking of cutting down on the media used, as we reviewers have to pay for most of the media ourselves! Also note that we reviewers do not get paid at all - so basically each review costs us the money spent on media plus a lot of time and work!
CDFreaks or somebody else should provide you with a 21st-century broadband line. I don't have a system as great as your review machine, but I have 100Mbps lines with which I'm able to upload and download multi-TB files each day.
I chose a few options in the poll. For instance, I think too many media are used. Do low-cost media justify the cost of reviewing them? If the manufacturers want their discs reviewed at all, they can send a 50-pack via DHL. Usually, low-quality media take even more time to test.
I often feel there are too few good reviews on the web. Many would prefer more reviews over fewer, longer ones.
CDFreaks reviews are really great, but in my humble opinion they could be made a bit better by insisting more on some points like product guarantee and reliability history, and less on company history (who cares?), packet writing (never saw anyone really use it) or extensive media tests.
A very nice addition to the reviews would be some kind of follow-up, a few weeks or months later, once the hardware has been extensively used by many people. Something like: “While the unit we reviewed was very good/bad, most users experienced bad/good units with such and such problems (or not). Firmware upgrades solved this problem (or not).” Of course, one can scan the forums, but having a summary at the end of the review would be much more efficient for the potential buyer.
Personally I think the most important thing about a DVD writer is how good it is at writing DVDs, so please don't make the DVD media tests shorter. For CD media it's OK to test somewhat fewer discs imho (i.e. include only the most common media).
Company info I don't care much about, nor packet writing or game backups. Audio backups, including EAC extraction, and tests of different copy protections I do want, though.
More reviews would be nice, but I prefer quality over quantity, and tests of different drives based on the same OEM unit are not very interesting imho.
It's my feeling that too many reviews of DVD writers (regardless of the reviewers and sites) devote too much space to CD reading and CD writing. Most people are going to use DVD writers to write DVD discs. Much more time is spent writing DVD+R and DVD-R than reading and writing CDs. Also, there's not much left to test with CDs anymore, especially since DVD writers are mostly the same at CD writing, whether at 12x, 16x, 24x, 32x, 40x, or 48x. It could be different at 56x or 60x, but that will probably never happen. My point is that spending resources, including time and money, on CDs is a waste if the same resources can be used to review more DVD writers writing more DVD discs.
Company information and history are important. If one compares how much time is needed to write up the company information with the importance of the data, it shouldn't be regarded as unnecessary. Usually it's just copy-and-paste work. I'd suggest adding company information only when the manufacturers themselves send it, fully prepared for the review in a decent form. If not, just copy and paste the “About Company” page from the manufacturer's site. It shouldn't take more than one minute.
More DVD writer reviews are great. But it's even more important to publish them as early as possible. If it's months after the product's release to retail and OEM channels, too many prospective readers are likely to have turned their eyes to the next-generation drives. More seamless cooperation with the drive and media manufacturers (and also the main distributors) is without doubt necessary.
Reviewing the same drive twice is not only a waste of resources but also confuses readers. On the other hand, most of the content from the first review can be reused. If the Lite-On SOHW-832S was reviewed, for instance, three weeks before August 1, CDFreaks can still review the Sony DRU-700A on August 2 without testing the same hardware all over again, but the introduction and conclusion should note that the drives are identical in most respects except warranty, price, firmware, etc.
Company history can be axed, I guess. Packet writing doesn't sound that exciting, either. Game backups are important, because I think a lot of people care about that (and I'll admit, it was one huge plus for me when choosing my drive back in January - even though I never ended up using it for that… hehe).
CD media tests could be whittled down a little bit, I suppose. DVD media tests should be kept as-is, with perhaps a bit of whittling down on the exotic, hard-to-find bad media types. Something to consider for the media tests… after reviewing RICOHJPNR01, for example, it might be worthwhile to note which brands this media can be found under. Say something like, “This was a Fuji-branded xxxx disc, which has been known to also be sold under other common brands such as Memorex, Sony, blah, blah, blah.”
Reading tests… I think the reading tests comparing an 8:10 read with an 8:30 read could be changed… perhaps just note that the drive reads at 7x-16x CAV with a read time of approximately 5 minutes (rounded). The reason is that some people get caught up in things like that and would pick drive A over drive B because it can read a disc 5 seconds faster - not everyone knows that a good drive is a balance between speed and write quality. If the difference is 30 seconds or more, it might be worth noting…
CDFreaks has the best reviews on the whole net. I would like to see some more games added to the game tests. A SafeDisc 3.15 game would be very nice. I would also like to see more DVD+RW tests, with successful reading tests in an average DVD-ROM drive. This media is definitely one of the most tricky to write and read correctly.
I think the CD portions of the reviews should be more limited, but keep the game copying and audio copying parts, since those are still important to some readers.
I think the DVD media quality testing section should be kept the same size, but more tests should be run on each disc, to get a more balanced view of playback performance. A second test run on the PX-712a would be a great start!
Hm. Removing CD tests entirely and extending the DVD media tests could require more time per review. Multiple tests on each disc sound scary to me.
There are always new firmware versions and hacked versions and new batches of media and recording software bug fixes and price drops, etc. It is impossible to take all of those into consideration while performing the review.
Detailed media tests are usually done in the DVD Media Tests forum.
Would like to see 3 media compatibility features:
-could you include in the reviews the media compatibility of Memorex CD-RW and DVD+/-RW
-could you include in the reviews the media compatibility of Philips CD-RW and DVD+/-RW
-could you include in the reviews a best-compatibility list (handy to see what a drive supports, so one knows which media is best to buy for a drive)
Yes, the 2x testing takes a while, which is why I do my personal testing at 2x-5x, or 5x-12x. All my official testing is done at both 2x and 5x-12x to show how the disc handles various speeds. I run both tests for both PIE/POF and PIF/POF settings. I also include Jitter tests, transfer rate tests, and then do a K-Probe scan at 8x, and a transfer rate test at 8x on my LiteON 832s.
So yes, I know ALL about how long it takes, and I know exactly what I’m asking for.
Alternatively, you could always rent a Datarius? I'm sure I could assist you with that if you were interested.
And for what it’s worth, anyone doing drive reviews should get paid for it in my opinion. I may not agree with some of the reviews here personally, but the time and effort and money put into them should be at least partially reimbursed, if not completely!
For the write quality sections, I'd like to see a Q-Check done on the PX-712A, and a full transfer rate test done in CD Speed on a drive like the SOHD-167T or similar. That shows the true writing quality of the drive, as you cannot tell from just one or the other. A scan on the 712 and a scan on the SOHW-832S (or similar) would be best, coupled with the CD Speed graph, but I realize that would take far more time than you can give to each review.
So, realistically, just a scan on the 712 and a transfer rate test for each DVD, with the scans done at 2x, as higher speeds are somewhat unreliable.
In my opinion, the tests are fine, but as has already been said here, who cares about the company info? If someone does, he/she can always go to the company's site and find all the info he/she requires. I would personally like to keep the CD-R tests too, especially for DVD writers, as some of us replace a CD-RW drive with a DVD-/+RW one - but with emphasis on the copy protections and how they are handled. Burning quality is also important.