About Loadout Lab — Who We Are and Why We Test the Way We Do
The lab, unfiltered

Built because
the reviews that
ranked highest
said the least.

Loadout Lab started in 2022 as a private spreadsheet. The founder — a competitive FPS player who’d been gaming seriously since middle school — had burned through three different gaming mice in eight months following YouTube recommendations. Each one was perfectly reviewed, enthusiastically bought, and quietly disappointing by week six. The spreadsheet was just notes: what the spec sheet claimed, what his hands actually felt after a month, what he’d tell someone who asked.

He showed it to a few people on Discord. Then more people asked for it. Then someone asked why there wasn’t a site that worked this way — rigorously, without the production budget optimism, without the affiliate incentive to push the most expensive option. The honest answer was: because it’s slower and harder than running an algorithmic content machine. That was the gap.

In its first six months, the site published three reviews. Three. Because nothing else had cleared four weeks of real use. That pace cost early traffic and probably lost some readers who wanted more content. It also built the only thing this site actually cares about: a track record you can verify.

Photo: Jordan Reyes, Founder, at the testing desk in a low-lit room, multiple peripherals visible. Austin, TX · 2024
What we do every day

We buy gear with our own money and tell you honestly what it’s worth.

Every product in a Loadout Lab review was purchased at full retail price. No review units. No early access arrangements. No PR outreach emails. The testing process runs a minimum of four weeks across multiple daily sessions, multiple game genres, and multiple comparison products at the same price point.

We score build quality, software stability, real-world performance, and long-term feel as separate metrics — because they’re genuinely different things. A mouse that scores perfectly on sensor accuracy and terribly on cable drag is not an “overall 8/10.” It’s a specific product for a specific type of player.

Where this is going

Change how this category treats people who actually know what they want.

The gaming peripheral review space currently operates in two modes: influencer hype cycles and spec-sheet paralysis. The reader who has done their research, has strong opinions about sensor interpolation, and knows they want a sub-60g mouse for low-sens FPS — that person is constantly underserved by content that assumes they know nothing.

Loadout Lab is building the resource that reader actually needs: technically thorough, direct about trade-offs, and honest about the cases where paying more doesn’t buy you anything real. The goal isn’t to be the biggest gaming site. It’s to be the one people trust when the purchase actually matters to them.

The principles that
run the lab.

These aren’t wall plaques. They’re the specific rules we follow that make this site different from the ones we read before we decided to build our own.

The 28-Day Rule

Nothing gets written before the product has been used daily for a minimum of four weeks. First impressions are almost always optimistic. The coating wear, the software update that breaks functionality, the cramped grip fatigue that only shows up after hour three — none of that happens in a weekend.

Kill Your Darlings

When a product we’ve recommended publicly gets displaced by something genuinely better at the same price, we update immediately — even if the old pick was popular, even if it hurts our existing traffic. The review is for the reader, not the algorithm. We’ve done this four times since launch.

No Paid Spots Ever

Not sponsored posts. Not “featured partner” placements. Not affiliate deals that change which product we put at the top. Amazon Associates commission is the only revenue — and it’s identical across every product we could recommend. The links to the $45 mouse and the $180 mouse pay the same percentage. There is no financial incentive to push premium.

Hand-Shape Honesty

The right peripheral depends on grip style, hand size, and what you play. Every review specifies who it’s built for and who should skip it. A claw-grip player and a palm-grip player do not need the same mouse. We don’t pretend the spec sheet answers that question.

The Hype Audit

When a product goes viral — forum threads exploding, influencers uniformly positive, “best mouse of the year” posts everywhere — that’s when we slow down, not speed up. Hype distorts. We let the cycle peak before we publish, so we can tell you whether the consensus held up after the honeymoon.

Budget Gets the Same Rigor

The $35 option and the $180 option run the same test protocol. We’ve written reviews where the budget pick won — and published them without softening the conclusion to avoid offending the premium brand. A $40 mouse that outperforms a $120 mouse on the spec that matters to you is the correct recommendation.

What actually happens
before a review goes live.

This is the process generic affiliate sites skip because it takes too long and doesn’t scale. It’s also exactly why our reviews are different.

We buy it ourselves.

Full retail price. No review units. No press samples. We get the same box the customer gets. If a product launches with limited availability, we wait in the same queue as everyone else.

We’ve returned 15+ products in the past year at a personal financial loss because they didn’t pass testing. That cost doesn’t go away. It’s what makes the process real.

28 days of actual daily use.

Not a long-weekend deep dive. A minimum of four weeks, with multiple daily sessions across different game genres — competitive shooters, MMOs, slower strategy games — to catch genre-specific issues that a single-game test misses.

We note specifically when long-term feel diverges from day-one feel. That divergence is called out in every published review.

Side-by-side comparison at the same price.

Every product is evaluated against at least two competitors at the same price tier. We don’t review a $75 mouse against a $150 alternative — we review it against other $75 options and tell you clearly what the extra $75 actually buys you if you’re considering moving up.

Sometimes the answer to “is the premium worth it?” is no. We say that.

Verdict with no hedging.

The review ends with a clear buy/skip call, specific “skip it if” conditions, and a direct answer to the question every reader actually has: “Is this the right product for how I play?” Not a score. Not a pros-and-cons table. A real answer.

If the verdict is “it depends,” we tell you exactly what it depends on — not just that it does.

If this is how you think
reviews should work —
you’re in the right place.

Start with the category you’re actually shopping for right now. Or read through how we think about a specific playstyle in the Build Your Loadout guides. You’ll know within five minutes whether this site thinks the way you do.