Frequently Asked Questions
We get lots of emails, article comments, and social media feedback asking the same questions about our testing procedures and DXOMARK in general. We’ve put this FAQ together to help answer some of those questions.
The DXOMARK Speaker test protocol is designed to grade the consumer experience when using the device’s playback function. It evaluates audio quality attributes across a range of use case scenarios and, like our Camera, Selfie, and Audio test protocols, combines objective measurements in the lab under controlled conditions with perceptual evaluation methods.
Tests are carried out using a wide selection of audio clips, including commissioned music tracks, voice, and other multimedia content. The DXOMARK Speaker overall score is computed from all scores and measurements obtained during the testing process.
For more details about our Speaker protocol, click here.
All our smartphone Audio tests are undertaken in a dedicated state-of-the-art Audio lab. DXOMARK engineers perform a variety of objective tests and undertake more than 20 hours of perceptual evaluation under controlled lab conditions. The test protocol is based on a comparison between devices and covers several typical Audio use cases, for example, rear and front camera video, recording memos, listening to music, watching movies, or gaming.
Playback and Recording performances are evaluated using the built-in speakers and microphones, respectively, and default apps. For more details about our Playback protocol, click here. For more details about our Recording protocol, click here.
For the DXOMARK Camera and Selfie reports, devices are tested at default settings both in controlled lab environments and in natural indoor and outdoor scenes, covering a wide range of light conditions and user scenarios. Overall, an engineering team works for a week to capture and evaluate over 1,600 test images and more than 2 hours of video for each review.
Scores rely on purely objective tests for which the results are calculated directly by the test equipment, and on perceptual tests for which we use a sophisticated set of metrics to allow a panel of image experts to compare various aspects of image quality that require human judgment. A number of sub-scores are condensed into Photo and Video scores, which are then combined into an Overall score.
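The exact sub-scores, weights, and aggregation formula DXOMARK uses are not published, but the rollup described above can be pictured as a weighted combination. The sketch below is a minimal illustration of that idea only; the attribute names, values, weights, and Photo/Video split are entirely hypothetical.

```python
# Illustrative sketch only: DXOMARK's real sub-scores, weights, and mapping
# are not public. Hypothetical values are used here simply to show how
# sub-scores could roll up into Photo and Video scores and then an Overall score.

def weighted_score(sub_scores: dict, weights: dict) -> float:
    """Combine named sub-scores into one score using normalized weights."""
    total_weight = sum(weights[name] for name in sub_scores)
    return sum(sub_scores[name] * weights[name] for name in sub_scores) / total_weight

# Hypothetical sub-scores from objective and perceptual tests (0-100 scale).
photo_subscores = {"exposure": 92, "color": 88, "autofocus": 95, "texture": 80, "noise": 78}
video_subscores = {"exposure": 85, "color": 82, "stabilization": 90, "artifacts": 76}

# Hypothetical weights reflecting the relative importance of each attribute.
photo_weights = {"exposure": 1.0, "color": 1.0, "autofocus": 1.0, "texture": 0.8, "noise": 0.8}
video_weights = {"exposure": 1.0, "color": 1.0, "stabilization": 1.2, "artifacts": 0.8}

photo = weighted_score(photo_subscores, photo_weights)
video = weighted_score(video_subscores, video_weights)
overall = 0.6 * photo + 0.4 * video  # hypothetical Photo/Video split
print(round(photo, 1), round(video, 1), round(overall, 1))
```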
Every device is tested in exactly the same way, in identically configured lab setups, using the same test procedures, the same scenes, the same types of image crop rankings and the same software. This means DXOMARK results are reliable and repeatable. For more information about our DXOMARK Mobile test protocol, click here, or here to find out more about our scores. For more information about our DXOMARK Selfie test protocol, click here.
DXOMARK tests displays under real-world conditions using objective and perceptual evaluation methods, exposing the displays under test to ambient light levels ranging from near-pitch darkness to bright daylight (above 30,000 lux).
We test for display quality attributes such as readability, color, video, motion, touch, and artifacts in a number of typical use case scenarios, including web browsing, night reading, in-car navigation, and watching movies.
All devices are tested in a new state-of-the-art laboratory that is entirely dedicated to display testing, and equipped with the latest testing instruments, including spectroradiometers, imaging colorimeters, conoscopes, high-end DSLRs, and professional reference monitors, among other tools. We also use a custom-built display bench which facilitates testing automation and ensures that all devices are tested under the exact same conditions.
For more information, read our detailed articles about display testing here and here.
For our DXOMARK camera sensor reviews, we only measure the image quality of cameras that are capable of capturing images in RAW format. All measurements are taken before demosaicing or any JPG processing has taken place.
Camera sensors are tested for the following criteria:
- Color sensitivity
- Noise (standard deviation, signal-to-noise ratio, dynamic range, and tonal range), illustrated in the sketch below
- ISO sensitivity (speed)
For more information about our camera sensor test protocol, click here.
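As a rough illustration of the noise-related criteria above, the sketch below applies the standard textbook definitions of signal-to-noise ratio (in dB) and dynamic range (in EV) to a synthetic RAW patch. It is not DXOMARK's actual measurement pipeline, which operates on real RAW captures before demosaicing; the sensor values used are hypothetical.

```python
import numpy as np

# Standard textbook definitions, shown only to illustrate the noise-related
# criteria listed above; DXOMARK's exact measurement pipeline is more involved.

def snr_db(patch: np.ndarray) -> float:
    """Signal-to-noise ratio of a uniform RAW patch, in dB."""
    signal = patch.mean()
    noise = patch.std()  # noise taken as the standard deviation of the patch
    return 20.0 * float(np.log10(signal / noise))

def dynamic_range_ev(saturation_level: float, noise_floor: float) -> float:
    """Dynamic range in EV (stops): ratio of saturation level to noise floor."""
    return float(np.log2(saturation_level / noise_floor))

# Example with a synthetic uniform patch from a hypothetical 12-bit sensor.
rng = np.random.default_rng(0)
patch = rng.normal(loc=1000.0, scale=12.0, size=(64, 64))
print(f"SNR: {snr_db(patch):.1f} dB")
print(f"DR : {dynamic_range_ev(saturation_level=4095, noise_floor=3.0):.1f} EV")
```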
For our DXOMARK lens reviews, we evaluate the performance of interchangeable lenses for cameras equipped with sensors that can capture images in RAW format. Lenses are tested mounted on cameras and measured for the following criteria:
- Sharpness, derived from the MTF (Modulation Transfer Function) measurement (see the sketch below)
- Distortion and Chromatic aberrations
- Vignetting
- Light transmission (T-stop)
For more information about our lens test protocol, click here.
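To give a feel for how sharpness relates to the MTF measurement, the sketch below computes MTF50, the spatial frequency at which contrast drops to 50%, from a synthetic MTF curve. MTF50 is a common summary metric used here purely for illustration; DXOMARK's published sharpness score is derived from MTF measurements but is not simply MTF50, and the curve below is a toy model, not measured data.

```python
import numpy as np

# Illustrative only: MTF50 is one common way to summarize an MTF curve.
# The curve below is synthetic; real sharpness scoring uses measured MTF data.

def mtf50(frequencies: np.ndarray, mtf: np.ndarray) -> float:
    """Linearly interpolate the frequency (cycles/pixel) where MTF = 0.5."""
    below = int(np.argmax(mtf < 0.5))  # first sample under 50% contrast
    f1, f2 = frequencies[below - 1], frequencies[below]
    m1, m2 = mtf[below - 1], mtf[below]
    return float(f1 + (0.5 - m1) * (f2 - f1) / (m2 - m1))

# Synthetic, monotonically falling MTF curve for a hypothetical lens/sensor pair.
freqs = np.linspace(0.0, 0.5, 51)  # cycles per pixel, up to the Nyquist limit
curve = np.exp(-8.0 * freqs)       # toy model of contrast roll-off with frequency
print(f"MTF50 ≈ {mtf50(freqs, curve):.3f} cycles/pixel")
```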
No, the testing methodologies are very different. Have a look at the questions above in this FAQ for more detail on all our testing protocols.
Manufacturer research shows that the majority of users capture smartphone images exclusively at default settings. We want our reviews to be relevant for as many users as possible, which is why we decided to do most of our testing at default settings in auto mode.
That said, there are some exceptions. With the latest update to our test protocol we have added a couple of manually operated features (zoom and bokeh), and we aim to add other frequently used special features in the future. In addition, in video mode we use the resolution that yields the best results, even if it’s not the default setting (see the next question).
We always test a smartphone camera’s video mode at the resolution that yields the best overall result, which is not necessarily the default setting. Resolution is only one factor that helps determine the overall score. While 4K offers the best resolution, some devices do not offer image stabilization or other features when recording in 4K, which means that using a lower resolution setting would provide better overall image quality and lead to a higher score.
Our publication schedule depends on a range of factors. Device availability is one of the most important ones, however. Sometimes reviewable production units are available on or even before the day of announcement from the manufacturer. On other occasions it can take months after a device has been officially launched before we are able to acquire it on the market. Most of our reviews are of devices that are publicly available.
Other factors include the level of interest in a device by our readers (as analyzed through search queries, feedback messages, article comments etc.), the level of innovation in a product and, last but not least, our own interest in the product.
A lot of work goes into our smartphone tests. As of Nov. 1, 2019, DXOMARK allows retesting of some devices with new firmware if significant improvements have been made. But because of limited staff and testing capacity, we will not be able to retest the same device multiple times. The decision to retest will be at our discretion.
All devices are tested using the same protocol and methodology. If brand A consistently achieves better test results than brand B that simply means that brand A’s devices come with better main cameras, selfie cameras or audio systems.
For each camera, front camera, or audio test, DXOMARK engineers spend approximately two weeks recording and evaluating sample images or audio files. Then it takes a couple of days to convert the test results into an article that can be published on the site. Like everyone else’s, our resources are limited and we have to be selective in terms of test devices. Selection depends on a range of factors, including device availability, the level of interest in a device by our readers (as analyzed through search queries, feedback messages, article comments, etc.), the level of innovation in a product and, last but not least, our own interest in the product.
The goal of any test protocol is to cover as many use cases as possible. Defining use cases and a preliminary test process for each is therefore the first step in the process. We then design the measuring tools and protocols required to evaluate test devices in an accurate and reproducible way.
Once those tools and protocols are in place, we run a wide range of tests on several reference devices to design a robust scale/scoring system for all test attributes. We eventually fine-tune the scoring system using the results from our reference tests. This ensures that final test results will be relevant from an end-user point of view and that they are fine-grained enough to highlight even the tiniest differences between devices.
At this point the protocol and scoring system are fixed and we can use them for testing. Every device goes through exactly the same test protocol and is scored using the same scoring system until the next protocol update. During the entire process we are in touch with key players in the industry. This allows us to get their feedback on our proposed methodologies and to be aware of any new technologies that will be released in the near future.
DXOMARK has a long history of working closely with the imaging and mobile industries. Long before the DXOMARK website was launched, in the early 2000s, we designed Analyzer, the first comprehensive suite of hardware and software for camera testing and tuning, which is today deployed at more than 150 sites all over the globe. From those early days we have always had in-depth technical discussions with our customers to help us understand their requirements and how new technology under development should be measured. This also helps us keep up to date with the continuous evolution of camera technologies.
During the mid-2010s the importance of the camera in smartphones became obvious, and our expertise and reputation in the field meant that we were approached by key players in the mobile industry, asking us to help them optimize their camera products.
Since the very beginning of these relationships, we have strictly separated our editorial activities from our B2B activities, with the editorial team exclusively making decisions on publication schedules and policy. Regardless of brand or manufacturer, all test devices undergo exactly the same test protocols, and manufacturers do not pay to use our scores in their marketing material.
In some cases, our relationship with manufacturers also helps us get access to pre-production units and early firmware versions. This allows us to test devices sooner than would otherwise be possible. In these circumstances we will always purchase a commercially available device in a store at a later stage and confirm the original test results.