Helmet Impact Testing: Blitzing on a Budget

What happens when your head hits a wall at 18 MPH? Nothing good, we can assure you, and yet those kinds of impacts are a daily occurrence in American football.

Sports medicine professionals and research organizations continue to raise awareness and educate athletes about the risk of concussive brain injuries and the limited protection of helmets.

With the mounting attention drawn to this issue, the NFL even created an engineering challenge to help drive the development of safer helmets.

NFL Helmet Challenge
RESOURCE CREDIT: NFL.com | Innovation Challenges

Not long ago, when a client came to us with a concept for an improved helmet, we saw it as an exciting opportunity to make a positive impact on this serious and difficult-to-solve issue.

Project: Engineering Support to Tackle Helmet Impact Testing

The concept was innovative: develop a helmet with a unique liquid damping system designed to minimize traumatic brain injuries in high-impact sports.

However, when we started, the project was full of questions and ambiguities. Very little was known in advance, except that we were to refine the client’s existing ideas into a helmet that performed consistently and could be manufactured in large quantities.

Ultimately, we would need to solve a number of complex and concurrent objectives under significant budget and time constraints:

> Design an innovative new product

> Develop an optimal test protocol

> Design and build an effective, low-cost test fixture

> Evaluate performance using test data

Tackling ambiguous challenges requires a flexible mindset

From decades of experience helping clients solve complex engineering challenges, we knew that one of our best assets when starting this type of work is to have a flexible mindset. This is especially true when juggling competing objectives, multiple constraints, and ambitious concepts that have a fair amount of ambiguity.

So, we did what we do best: we embraced the ambiguity! We dove in and tackled what was possible each day, and then re-evaluated the next day. This iterative problem-solving approach helped us start small but move forward quickly by applying what we learned.

Testing was a major part of the project. We were constantly building different prototypes, sometimes several a day, and they all needed to be tested.

Let’s look at how testing unfolded and what we learned about developing the best solutions.

Challenge: Affordable Helmet Impact Testing with Accurate Analysis

Our client was a small startup with limited financial resources. We couldn’t simply buy a large commercial impact tester costing thousands of dollars; that alone might have consumed the entire project budget.

We needed something relatively inexpensive, with the flexibility to be modified and reconfigured as the project evolved.

Plus, since the project took place at the height of the COVID work-from-home era, our test fixture also had to fit in the basement of our test engineer’s home!

Enabling modular helmet impact testing

We chose to build a low-cost test fixture with T-slot aluminum framing because it’s easy to assemble and reconfigure. We ordered all the pieces pre-cut from an industrial supplier and assembled them in an afternoon.

The basic setup was quite simple:

> We used gravity to slam a weight into the liquid damping system.

> For the falling mass we used a standard linear bearing carriage, designed to run directly on the T-slot frame.

> A load cell was positioned at the base to capture the impact of the falling mass hitting the prototype damper.

helmet test fixture

Exploring viable software, data capture, and analysis

The real challenge, though, wasn’t the mechanical design.

To evaluate each damper, we needed to make accurate measurements of force and displacement during the impact.

> The peak forces were often greater than 4500 N (1000 lb), and the impacts were brief: just a few milliseconds over a distance of less than 30 mm.

> To construct a detailed force-displacement curve, we needed to make hundreds of measurements during that time.

> We also wanted to capture high-speed video of each impact to help diagnose the behavior of each prototype.
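
As a quick sanity check on those requirements, the rough numbers in the sketch below (a 3 ms impact, 300 samples) are illustrative assumptions drawn from the ranges above, not measured project values:

```python
# Back-of-envelope check of the measurement requirements. The specific
# numbers (3 ms, 300 samples) are illustrative assumptions, not project data.
impact_duration_s = 3e-3     # "just a few milliseconds"
stroke_m = 30e-3             # "less than 30 mm"
samples_wanted = 300         # "hundreds of measurements"

min_sample_rate_hz = samples_wanted / impact_duration_s  # 100,000 Hz
avg_speed_m_per_s = stroke_m / impact_duration_s         # ~10 m/s over the stroke

print(f"Minimum DAQ sample rate: {min_sample_rate_hz / 1000:.0f} kHz")
print(f"Rough average speed during impact: {avg_speed_m_per_s:.0f} m/s")
```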

Because of the various project ambiguities, we decided to start out with a minimally viable test setup, and then upgrade it as we learned more.

The force measurements were straightforward:

> The dampers were placed on a small platform at the base of the tester, and that platform was mounted directly to a load cell.

> The load cell signal was amplified and then recorded in a low-cost data acquisition system (DAQ).

> The DAQ came with a software API, so we wrote some simple Python code to pull the force data and dump it into a spreadsheet.
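
The DAQ vendor and its API aren’t named here, so the sketch below uses a hypothetical `daq.read_samples()` call as a stand-in; the load-cell scale factor and file name are likewise illustrative.

```python
# Minimal sketch of the early force-capture step. The "daq" object and its
# read_samples() method are hypothetical placeholders for the vendor API;
# the load-cell scale factor below is illustrative.
import csv

SENSITIVITY_N_PER_VOLT = 2000.0  # example amplifier + load-cell calibration

def save_force_run(daq, filename="impact_run.csv"):
    samples = daq.read_samples()  # assumed to return [(time_s, volts), ...]
    with open(filename, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["time_s", "force_N"])
        for t, volts in samples:
            writer.writerow([t, volts * SENSITIVITY_N_PER_VOLT])
```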

Experimenting with high-speed video to measure distance

We wanted the displacement measurements to be non-contact, because we didn’t want the measurement process to interfere with the impact.

After considering several alternatives we decided to experiment with a video-based method. We were already planning to capture high-speed videos, so perhaps we could also use those videos to measure distance?

We had a digital camera already on hand (a mid-range SLR) that could record video at 960 frames per second. It wasn’t a true high-speed camera, but it was good enough for us to start with and didn’t cost anything.

To extract distance information from the videos, we used the open-source Kinovea software.

helmet testing software
RESOURCE CREDIT: Kinovea.org

This package was originally developed for sports analysis, but it works well for all kinds of video tracking applications.

> In Kinovea, we picked a target point at the start of each video: a colored dot attached to the falling weight.

> The software then tracked that dot, frame by frame, and calculated how far it had moved each time.

The video-based method worked surprisingly well, considering it was something we cobbled together with little to no expense.

It allowed us to measure the performance of our early prototypes and steered several subsequent rounds of design iterations.

Upgrading required for quicker data analysis

But as our helmet impact testing and design work increased, the shortcomings of the video method became more obvious.

The biggest issue was the post-test data analysis, which required several steps:

> Import the video from the camera into the software

> Analyze the video in the software, which involved several manual steps each time

> Export the distance vs. frame data from the software and the force vs. time data from the DAQ

> Bring both datasets into a huge spreadsheet

> Convert camera frame data into time data and then manually align the start point of the two datasets

> Correct for the errors created by tiny camera movements as the impact energy traveled through the concrete floor and shook the camera

> Interpolate the distance data, because the DAQ was measuring 50X more often than the camera

> Plot the force vs. distance curve
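
For reference, a condensed sketch of that manual merge is shown below. It assumes two hypothetical CSV exports (distance vs. frame from Kinovea, force vs. time from the DAQ) and uses the 960 fps frame rate mentioned earlier; the file names, column layout, and alignment are simplified.

```python
# Sketch of the manual video/DAQ merge. The two CSV files, their column
# layout, and the zero-point alignment are simplified assumptions.
import numpy as np
import matplotlib.pyplot as plt

FPS = 960.0  # camera frame rate

# Hypothetical exports: column 0 = frame index / time, column 1 = value
frames, dist_mm = np.loadtxt("kinovea_export.csv", delimiter=",", unpack=True)
t_force, force_n = np.loadtxt("daq_export.csv", delimiter=",", unpack=True)

t_video = frames / FPS      # convert frame index to seconds
t_video -= t_video[0]       # crude start-point alignment
t_force -= t_force[0]

# The DAQ samples ~50x faster than the camera, so interpolate the sparse
# video displacement onto the dense force timebase.
dist_interp_mm = np.interp(t_force, t_video, dist_mm)

plt.plot(dist_interp_mm, force_n)
plt.xlabel("Displacement (mm)")
plt.ylabel("Force (N)")
plt.savefig("force_displacement_video_method.png")
```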

Even under the best of circumstances, this process for data analysis took 10-15 minutes for each run. The resulting plots were only of medium quality, because of the interpolation and manual alignment steps.

We had learned a lot, but now it was time to upgrade the measurement system.

Solution: Linear Encoder Measurements with Python Data Analysis

Achieving greater precision with our measurements

Our distance measurements improved dramatically when we replaced the video-based method with an off-the-shelf linear encoder.

> The encoder had two parts: an adhesive strip affixed to the falling weight, and a stationary sensor mounted on the frame.

> The adhesive strip had a pattern of tiny magnetic stripes; each stripe generated a pulse in the sensor as it passed by.

> The encoder was designed for precision motion control, so it could measure the position of the weight to within ±1.2 µm at speeds up to 9 m/s, a huge improvement in both accuracy and precision.

test fixture mounting
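
To give a rough sense of how those encoder pulses become displacement data, the sketch below assumes the DAQ delivers a cumulative count per sample and that one count corresponds to the stated 1.2 µm of travel; the actual encoder interface may differ.

```python
# Illustrative conversion from cumulative encoder counts to displacement,
# assuming one count per 1.2 um of travel (the real interface may differ).
COUNT_TO_MM = 1.2e-6 * 1000.0   # 1.2 micrometers, expressed in millimeters

def counts_to_displacement_mm(counts):
    """Convert a list of cumulative encoder counts to millimeters of travel."""
    return [c * COUNT_TO_MM for c in counts]

# Example: 25,000 counts of travel is roughly 30 mm
print(counts_to_displacement_mm([0, 12_500, 25_000]))  # [0.0, 15.0, 30.0]
```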

Automating data handling and analysis with Python

The new encoder required an upgrade to the DAQ hardware, but that came with another huge benefit.

We were now using the DAQ for both the force and displacement measurements, so we expanded the Python code to import both datasets from the DAQ and do all of the analysis.

Python is a perfect tool for this kind of work. It’s relatively easy to learn, at least compared to many other programming languages, and there are thousands of existing libraries to easily perform all kinds of scientific and engineering calculations. For example, when we wanted to remove noise from our data, all it took was a single line of code to implement a low-pass Butterworth filter.
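
The article doesn’t name the library, but with SciPy (a reasonable assumption) the filter really is that compact; the cutoff frequency and sample rate below are illustrative values, not the project’s actual settings.

```python
# Minimal sketch of the noise filter, assuming SciPy. The cutoff and sample
# rate are illustrative, not the project's actual settings.
from scipy.signal import butter, filtfilt

def lowpass(force_n, sample_rate_hz=50_000, cutoff_hz=2_000, order=4):
    """Zero-phase low-pass Butterworth filter for the force signal."""
    b, a = butter(order, cutoff_hz, btype="low", fs=sample_rate_hz)
    return filtfilt(b, a, force_n)
```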

The Python routine automated the entire data analysis process:

> Read raw values from the DAQ, and convert them to more useful units

> Apply the noise filter

> Detect the start and finish of impact

> Calculate peak force, peak velocity, and other useful parameters

> Plot force vs. displacement and save it as a PNG file

> Save the raw and processed data to a CSV file
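
A condensed sketch of what such a routine might look like is shown below. It assumes force and displacement arrays (NumPy) sampled at a common rate have already been read from the DAQ; the threshold, sample rate, and file names are illustrative, not the project’s actual values.

```python
# Condensed sketch of the automated analysis routine. Inputs are assumed to
# be NumPy arrays sampled at a common rate; thresholds, rates, and file names
# are illustrative.
import numpy as np
import matplotlib.pyplot as plt

def analyze_run(force_n, disp_m, sample_rate_hz=50_000, run_name="run_001"):
    t = np.arange(len(force_n)) / sample_rate_hz

    # Detect the start and finish of the impact with a simple force threshold.
    active = np.flatnonzero(force_n > 50.0)   # 50 N threshold (assumed)
    start, finish = active[0], active[-1] + 1

    # Peak force and peak velocity (velocity from the displacement derivative).
    peak_force_n = force_n[start:finish].max()
    velocity = np.gradient(disp_m, t)
    peak_velocity_mps = np.abs(velocity[start:finish]).max()

    # Plot force vs. displacement over the impact window and save as a PNG.
    plt.figure()
    plt.plot(disp_m[start:finish] * 1000.0, force_n[start:finish])
    plt.xlabel("Displacement (mm)")
    plt.ylabel("Force (N)")
    plt.savefig(f"{run_name}.png")

    # Save raw and processed data to CSV.
    np.savetxt(f"{run_name}.csv",
               np.column_stack([t, force_n, disp_m]),
               delimiter=",", header="time_s,force_N,disp_m", comments="")
    return peak_force_n, peak_velocity_mps
```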

Scoring big with data analysis speeds, down from 15 minutes to <5 seconds

The Python code dramatically accelerated our testing (no pun intended!) and enabled a process of rapid, iterative engineering that led to several creative design elements.

What previously took 15 minutes with several manual steps was now happening automatically in less than 5 seconds!

Force-displacement plot from Python routine

Our solution came at a perfect time during the project, when we had made some major design decisions but needed to test dozens of different variations to optimize performance.

Winning Strategy: Maximum Flexibility to Quickly Apply Learnings

As we described earlier, ambiguity requires engineers to embrace the unknown in order to move quickly and discover the best solutions.

If we had approached this project with a more rigid product development process, we would not have had the freedom to make major, on-the-fly changes, applying what we learned at each stage about the product and the testing.

Similarly, it would have been foolish to try to build the “perfect” test setup on Day 1. Doing that would likely have led us to build the wrong thing, since we didn’t really understand what we truly needed.

Maintaining flexibility while working through each objective proved to be crucial for this helmet impact testing project.

How can we help you tackle your test engineering challenges?

Are you looking for a full range of custom hardware testing services?

Do you need help with verification and validation or failure analysis of your product?

With technical expertise in mechanical, electrical, and firmware testing, AC Product Development and Test Engineering teams are uniquely equipped to understand your product.

We move quickly to customize test protocols and create test fixtures for data collection, analysis, and design recommendations.

How can we help?

Explore: Learn how Virginia Tech publishes the results of its head impact research to generate helmet ratings for football and other sports

helmet testing
RESOURCE CREDIT: Virginia Tech. Helmet Ratings