
From Shooting Ranges to Computer Vision: How an iOS App Replaced Traditional Brass Plugs in Scoring
This article explores a unique intersection of traditional marksmanship and modern technology. The author, driven by a desire to hunt and cook venison from scratch, recounts their experience at a shooting range near Edinburgh. To streamline the tedious and manual process of scoring targets—which traditionally involves using physical brass plugs to determine shot placement—the author developed a sophisticated digital solution. By porting a 2012 OpenCV paper and training a state-of-the-art YOLOv8 computer vision model for iOS using CoreML, the author successfully automated the scoring ritual. This transition from manual inspection to mobile AI highlights the practical application of computer vision in niche hobbies, even if the technical overhead briefly delayed the ultimate goal of a home-cooked venison dinner.
Key Takeaways
- Manual Scoring Challenges: Traditional target scoring requires physical brass plugs to determine if a shot touches a ring line, a process that involves physical strain and repetitive manual checks.
- Technical Integration: The project involved porting a 2012 OpenCV paper and training a YOLOv8 model to automate visual recognition on iOS via CoreML.
- Motivation through Culinary Passion: The drive to learn shooting was rooted in a deep obsession with cooking from scratch, including charcuterie and making garum from grasshoppers.
- Environmental Context: The development took place amidst the specific conditions of a shooting range in Edinburgh, characterized by low ceilings and traditional scoring rituals.
In-Depth Analysis
The Transition from Brass to Bits
In the traditional setting of an Edinburgh shooting range, scoring is a meticulous "ritual." When a shot lands near a ring line, shooters reach for a tray of brass plugs of various sizes. A plug is inserted into the bullet hole, and the final score is determined by where its flange sits relative to the target rings. The author identified this process, compounded by the range's physical hazards such as low ceiling beams, as a bottleneck on the road to learning to hunt. To eliminate the "score-counting-head-hitting-plug-pushing ritual," the author turned to mobile technology.
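The plug rule described above is essentially geometric: a shot earns a ring's value if the plug's flange (or, digitally, the innermost edge of the hole) touches that ring's line. A minimal Python sketch of that rule follows; the ring radii and plug radius here are hypothetical placeholders, since actual dimensions vary by discipline and are not given in the article:

```python
from math import hypot

def score_shot(shot_x, shot_y, plug_radius_mm, ring_radii_mm):
    """Score a shot using the gauge rule: the shot earns a ring's value
    when the plug edge (shot centre distance minus plug radius) touches
    or crosses the ring line. Rings are given as {score: outer_radius_mm}."""
    distance = hypot(shot_x, shot_y)       # hole centre's distance from target centre
    edge = distance - plug_radius_mm       # innermost point the plug flange reaches
    qualifying = [s for s, r in ring_radii_mm.items() if edge <= r]
    return max(qualifying, default=0)      # best ring touched, or 0 for a miss

# Hypothetical ring table (score: outer radius in mm), loosely air-rifle-sized.
RINGS = {10: 0.25, 9: 2.75, 8: 5.25, 7: 7.75, 6: 10.25}

# A hole centred 5.0 mm out with a 2.25 mm plug reaches in to 2.75 mm,
# which touches the 9-ring line, so it scores 9.
print(score_shot(3.0, 4.0, 2.25, RINGS))  # → 9
```

The same "edge touches the line" comparison is what the brass plug performs mechanically; any digital replacement just needs the hole centre and an accurate plug radius.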
Engineering the Computer Vision Solution
The technical execution required bridging over a decade of research with modern mobile hardware. The author ported a 2012 OpenCV paper as the foundation of the target-analysis logic and combined it with a state-of-the-art YOLOv8 (You Only Look Once) model. By leveraging Apple's CoreML framework, the model was optimized to run on an iPhone, allowing the device to perform the task of the brass plug: detecting shot placement and calculating scores instantly through the camera lens, effectively digitizing a physical verification process that had remained unchanged for decades.
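The article doesn't publish the app's code, but the flow it describes, detect a hole with the model and then score it geometrically, can be sketched. In this illustrative Python fragment every name and number is hypothetical: the detection is stood in for by a bounding box in pixel coordinates, as a YOLO-style detector would return, and the visible target radius calibrates pixels to millimetres before the plug-style scoring rule is applied:

```python
def detection_to_score(box_px, target_centre_px, target_radius_px,
                       target_radius_mm, bullet_radius_mm, ring_radii_mm):
    """Convert a detected bullet hole (x1, y1, x2, y2 in pixels) into a score.
    Calibrates a pixel-to-mm scale from the visible target radius, then
    applies the same edge-touches-ring rule a brass plug implements."""
    x1, y1, x2, y2 = box_px
    hole_cx = (x1 + x2) / 2                          # hole centre in pixels
    hole_cy = (y1 + y2) / 2
    mm_per_px = target_radius_mm / target_radius_px  # calibration factor
    dx_mm = (hole_cx - target_centre_px[0]) * mm_per_px
    dy_mm = (hole_cy - target_centre_px[1]) * mm_per_px
    distance_mm = (dx_mm ** 2 + dy_mm ** 2) ** 0.5
    edge_mm = distance_mm - bullet_radius_mm         # innermost point of the hole edge
    qualifying = [s for s, r in ring_radii_mm.items() if edge_mm <= r]
    return max(qualifying, default=0)

# Hypothetical ring table (score: outer radius in mm).
RINGS = {10: 0.25, 9: 2.75, 8: 5.25, 7: 7.75, 6: 10.25}

# A 100 px target radius standing for a 10.25 mm card: 1 px ≈ 0.1025 mm.
score = detection_to_score((248, 248, 260, 260), (250, 250), 100.0,
                           10.25, 2.25, RINGS)
```

In the app itself this post-processing would run after CoreML inference on the camera frame; the sketch only shows the geometry that replaces the physical plug, not the model or camera code.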
Industry Impact
This project demonstrates the increasing accessibility of high-end computer vision tools for individual developers. By successfully deploying YOLOv8 and OpenCV on a mobile device to solve a specific, niche problem like target scoring, it highlights the potential for AI to replace specialized physical tools (like brass plugs) in various hobbyist and professional fields. It also underscores the trend of "edge AI," where complex model inference is moved directly onto consumer hardware to provide real-time utility in environments without specialized equipment.
Frequently Asked Questions
Question: Why did the author decide to build an app instead of using brass plugs?
The author found the manual scoring process tedious and physically demanding, often involving hitting their head on low beams while checking cards. The app was designed to end the "ritual" of manual plug-pushing and speed up the progress tracking required for their hunting training.
Question: What specific technologies were used to create the scoring app?
The developer used OpenCV (based on a 2012 research paper), trained a YOLOv8 computer vision model, and implemented the solution on iOS using the CoreML framework.
Question: What was the ultimate goal behind learning to shoot?
The author's primary motivation was culinary. As an obsessive cook who makes their own garum and charcuterie, they wanted to learn to hunt so they could source and butcher whole venison carcasses from scratch.
