A compromised drill-hole database can invalidate an entire resource estimate — yet database quality assessment rarely receives the attention it deserves in standard due diligence workflows. Here is the rapid-triage checklist I run in acQuire (or equivalent) before I open a single compositing spreadsheet.
The thirty-minute rule is not about comprehensiveness — a full database audit for a significant resource estimate takes days. It is about identifying the issues that, if present, will fundamentally alter the scope and cost of the work ahead. Finding a critical data integrity problem early means you can negotiate the engagement accordingly; finding it after you have built a block model is expensive for everyone.
The Rapid-Triage Checklist
The first thirty minutes covers six checks, in roughly this order:
- Collar coordinate system and projection verification
- Downhole survey coverage and consistency
- From-to interval continuity in the assay table
- QAQC insertion rate and performance summary
- Assay lab change history
- Lithological and geological coding completeness
Each of these checks takes three to five minutes with a properly structured database query. The sections that follow describe what I am looking for in each.
Collar Survey Inconsistencies
Collar coordinate errors are among the most consequential database problems because they propagate through every subsequent operation: downhole desurveying, compositing, variography, and kriging. A 10-metre collar error in a system with 40-metre drill spacing will materially distort the spatial model.
My first query compares collar coordinates in the drill-hole database against an independent source — typically the project's GPS survey files or the QP's field notes from a site visit. I flag any hole where the coordinate discrepancy exceeds 5 metres in plan or 2 metres in elevation.
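This comparison can be sketched in plain Python. The data shapes and function name below are illustrative, not acQuire syntax; the tolerances (5 m plan, 2 m elevation) are the ones stated above.

```python
import math

# Illustrative sketch: flag holes whose database collar differs from an
# independent survey (e.g. a GPS file) by more than the stated tolerances.
PLAN_TOL_M = 5.0   # plan-view tolerance from the text
ELEV_TOL_M = 2.0   # elevation tolerance from the text

def flag_collar_discrepancies(db_collars, gps_collars):
    """Both inputs: {hole_id: (easting, northing, elevation)} in metres."""
    flags = []
    for hole_id, (e1, n1, z1) in db_collars.items():
        if hole_id not in gps_collars:
            flags.append((hole_id, "no independent survey record"))
            continue
        e2, n2, z2 = gps_collars[hole_id]
        plan = math.hypot(e1 - e2, n1 - n2)
        elev = abs(z1 - z2)
        if plan > PLAN_TOL_M or elev > ELEV_TOL_M:
            flags.append((hole_id, f"plan {plan:.1f} m, elev {elev:.1f} m"))
    return flags
```

Holes missing from the independent source are flagged too, which feeds directly into the 5% threshold discussed below.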
The second collar check examines azimuth and dip at collar against the downhole survey measurement at the first survey station. Discrepancies greater than 5° in azimuth or 3° in dip suggest either a data entry error or that the collar was not surveyed at the time of drilling — a common problem in older programmes where collar surveys were retroactively estimated.
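A sketch of the orientation check, again with illustrative data shapes; note that azimuth differences must be computed around the 360°/0° wraparound, or holes drilled near north will be falsely flagged.

```python
AZI_TOL_DEG = 5.0  # azimuth tolerance from the text
DIP_TOL_DEG = 3.0  # dip tolerance from the text

def azimuth_diff(a, b):
    """Smallest angular difference in degrees, handling the 360/0 wraparound."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def collar_orientation_flags(collars, first_stations):
    """Inputs: {hole_id: (azimuth_deg, dip_deg)} for the collar table and
    for the first downhole survey station of each hole."""
    flags = []
    for hole_id, (azi_c, dip_c) in collars.items():
        if hole_id not in first_stations:
            continue  # no downhole survey to compare against
        azi_s, dip_s = first_stations[hole_id]
        if (azimuth_diff(azi_c, azi_s) > AZI_TOL_DEG
                or abs(dip_c - dip_s) > DIP_TOL_DEG):
            flags.append(hole_id)
    return flags
```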
If more than 5% of collars lack an independent survey record, the entire coordinate dataset should be treated as suspect until field verification is completed. This is a scope-altering finding that should be escalated immediately.
Assay Data Red Flags
Gap and Overlap Analysis
The most common assay-table integrity problems are gaps and overlaps in the from-to interval record. A gap is an unsampled interval within a drill hole; an overlap is an interval that appears to have been sampled twice. Both can be systematic (a contractor's consistent sample-splitting practice) or random (data entry errors).
I run a simple query summing (TO - FROM) for all intervals per hole and comparing the result against the hole's maximum depth. Any discrepancy greater than 0.1 metres flags the hole for detailed review. In a well-managed database this query should return zero flags; in practice, projects that have been through multiple contractors or data management systems often flag 5–15% of holes.
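A Python sketch of the sum-versus-depth test, with one caveat worth building in: a gap and an overlap of equal length cancel in the summed total, so a pairwise scan of sorted intervals catches cases the sum alone would miss. Function names and the per-hole data shape are illustrative.

```python
def interval_check(rows, hole_depth, tol=0.1):
    """rows: [(from_m, to_m), ...] for a single hole.
    Returns (flagged, gaps, overlaps): 'flagged' is the sum-vs-depth test
    from the text; the pairwise scan catches gaps and overlaps that cancel."""
    sampled = sum(t - f for f, t in rows)
    flagged = abs(sampled - hole_depth) > tol
    gaps, overlaps = [], []
    ordered = sorted(rows)
    for (f1, t1), (f2, t2) in zip(ordered, ordered[1:]):
        if f2 - t1 > tol:
            gaps.append((t1, f2))      # unsampled interval
        elif t1 - f2 > tol:
            overlaps.append((f2, t1))  # doubly sampled interval
    return flagged, gaps, overlaps
```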
Assay Distribution Outliers
Extreme high-grade assay values — typically defined as values exceeding the 95th percentile by a factor of three or more — require individual verification. I generate a list of the top 20 assay values in the database and cross-check each against the original laboratory certificate. Certificate-to-database transcription errors are rare but occur, and a single uncaught transcription error in a high-grade outlier can inflate a resource estimate by 5–10% on contained metal.
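Generating the review list is straightforward; the sketch below uses a simple nearest-rank 95th percentile (an implementation choice, not a prescription) and the three-times-P95 cutoff stated above.

```python
def outlier_review_list(assays, n=20):
    """assays: [(hole_id, from_m, to_m, grade), ...].
    Returns the n highest-grade records for certificate cross-checking,
    plus any records above 3x the 95th percentile (cutoff from the text)."""
    ranked = sorted(assays, key=lambda r: r[3], reverse=True)
    grades = sorted(r[3] for r in assays)
    p95 = grades[int(0.95 * (len(grades) - 1))]  # nearest-rank P95
    threshold = 3.0 * p95
    extremes = [r for r in ranked if r[3] > threshold]
    return ranked[:n], extremes
```

Every record in the first list gets checked against its laboratory certificate; the second list tells you how many records meet the formal extreme-outlier definition.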
"The highest grade assay in the database should always have the most documentation, not the least. When I find the opposite, I slow down."
Downhole Survey Issues
Downhole survey coverage determines the accuracy of three-dimensional hole path reconstruction and, by extension, the accuracy of every spatial relationship in the block model. I look for three specific problems:
- Survey station spacing exceeding 30 metres in steep or strongly foliated ground. In structurally complex terrain, holes can deviate significantly over 30-metre intervals. Sparse surveys systematically underestimate the true deviation.
- Single-shot (magnetic) surveys in areas of known magnetic interference. Magnetic surveys are unreliable within 50 metres of magnetite-rich mineralisation or mafic intrusions. Any project with significant iron formation or mafic host rocks should have gyroscopic survey coverage for the critical resource intervals.
- Survey records that end more than 20 metres above the bottom of hole. Unsurveyed tail sections produce reconstructed hole paths that diverge from reality at depth — exactly where many deposits have their highest-grade core.
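The first and third checks reduce to simple arithmetic on the survey depth column (the magnetic-interference check requires geological context and cannot be automated this way). A sketch per hole, with the 30 m and 20 m thresholds from the list above:

```python
def survey_coverage_flags(station_depths, hole_depth,
                          max_spacing=30.0, max_tail=20.0):
    """station_depths: downhole depths (m) of survey stations for one hole.
    Flags station spacing above max_spacing and an unsurveyed tail longer
    than max_tail above end-of-hole."""
    flags = []
    depths = sorted(station_depths)
    for a, b in zip(depths, depths[1:]):
        if b - a > max_spacing:
            flags.append(f"spacing {b - a:.0f} m between {a:.0f} and {b:.0f} m")
    if depths and hole_depth - depths[-1] > max_tail:
        flags.append(f"unsurveyed tail of {hole_depth - depths[-1]:.0f} m")
    return flags
```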
QAQC Summary: What Passes and What Fails
The industry standard for QAQC insertion rates is: certified reference materials (CRMs/standards) at 5% of all samples, blanks at 5%, and field duplicates at 5%. Projects below these rates require explanation.
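Computing insertion rates is a one-pass count over the sample-type column. The labels below (`crm`, `blank`, `field_dup`) are illustrative; real databases code these differently.

```python
def qaqc_insertion_rates(sample_types, target=0.05):
    """sample_types: one label per sample, e.g. 'routine', 'crm', 'blank',
    'field_dup' (labels are illustrative). Returns each control type's
    insertion rate and whether it meets the 5% target from the text."""
    total = len(sample_types)
    report = {}
    for kind in ("crm", "blank", "field_dup"):
        rate = sample_types.count(kind) / total
        report[kind] = (rate, rate >= target)
    return report
```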
More important than insertion rate is the pass/fail record. I calculate the failure rate for each standard lot (the percentage of insertions that fall outside the ±2 standard deviation acceptance window) and flag any lot with a failure rate exceeding 10%. Systematic standard failures indicate either a laboratory accuracy problem or contamination in the sample preparation circuit — both of which require investigation before the assay data can be relied upon.
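A sketch of the per-lot failure calculation, assuming each CRM lot's certified mean and standard deviation are available (data shapes are illustrative):

```python
def crm_failure_rates(results, certified, max_fail=0.10):
    """results: {lot_id: [assayed values]}.
    certified: {lot_id: (certified_mean, certified_sd)}.
    A failure is a result outside mean +/- 2 SD; lots with a failure rate
    above max_fail (10% in the text) are returned for investigation."""
    flagged = {}
    for lot, values in results.items():
        mu, sd = certified[lot]
        fails = sum(1 for v in values if abs(v - mu) > 2 * sd)
        rate = fails / len(values)
        if rate > max_fail:
            flagged[lot] = rate
    return flagged
```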
Blank failure rates above 2% indicate contamination in the sample preparation or assay circuit. For gold projects, a contaminated circuit is a serious data integrity problem because it introduces positive bias that cannot be corrected retrospectively.
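The blank check follows the same pattern. The failure criterion used here, a result above ten times the analytical detection limit, is an assumed convention and should match the project's own QAQC protocol; the 2% rate threshold is from the text.

```python
def blank_failure_rate(blank_values, detection_limit, factor=10.0):
    """blank_values: assayed grades of inserted blanks.
    A blank 'fails' if it assays above factor x detection limit (assumed
    convention). Compare the returned rate against the 2% threshold."""
    fails = sum(1 for v in blank_values if v > factor * detection_limit)
    return fails / len(blank_values)
```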
What to Do When You Find Problems
The appropriate response to database red flags depends on their severity and pervasiveness. My decision framework:
- Isolated, verifiable errors (≤2% of records): correct from source documents, flag in the technical report, proceed with the estimate.
- Systematic errors affecting a specific contractor period or lab: quarantine the affected data, perform a sensitivity analysis with and without the quarantined intervals, and disclose the uncertainty in the resource classification.
- Pervasive data integrity problems (>10% of records affected, no verifiable source documents): recommend against completing a resource estimate until the database has been independently audited and re-validated. This is a stop-work recommendation, and it is the right call.
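The framework above can be expressed as a small decision function. The 2% and 10% thresholds are from the text; the mapping itself is a sketch, and real engagements involve judgment the branches below cannot capture.

```python
def triage(error_fraction, systematic, source_docs_available):
    """error_fraction: share of records affected (0-1).
    systematic: True if errors cluster by contractor period or lab.
    source_docs_available: True if errors are verifiable against originals."""
    if error_fraction > 0.10 and not source_docs_available:
        return "stop-work: independent audit and re-validation required"
    if systematic:
        return "quarantine affected data; run sensitivity analysis; disclose"
    if error_fraction <= 0.02 and source_docs_available:
        return "correct from source, flag in report, proceed"
    return "detailed review before proceeding"
```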
Under NI 43-101 Item 11, the QP is required to state their opinion on the adequacy of the data for the purposes of the resource estimate. "Data are adequate" is a professional opinion that must be earned through verification, not assumed.
JNA Resource Advisory conducts independent drill-hole database audits as a standalone service and as part of full technical due diligence engagements. Contact us to discuss your project.