
GI.blog Test and Measurement Insights
Tips & trends

June 25, 2019

3 Trends in testing you can’t ignore any longer

Testing is, by its very nature, an agile business. The structures and systems to be tested, whether in the automotive, aerospace, energy, or civil engineering segment, keep evolving quickly, and data acquisition systems need to keep pace with them. At the same time, not every hot new trend will change your tomorrow.

You probably know this already, having seen it over and over again, and you don't want to jump on every bandwagon. And yet there are developments in data acquisition that are here to stay and will have an impact on your test setup and the way you work, if not today, then tomorrow. So let us introduce what we think are the top 3 trends in testing that will not go away and are worth a closer look.

1. Speed

The faster the better. Or rather: the faster you measure, the more data you collect and the higher the resolution. High-speed measurement is an emerging trend, not only in the automotive industry but also in the growing market of electric drives. Whether it concerns testing of (hybrid) electric propulsion systems, electric drivetrains, or batteries, measuring electrical parameters faster and with lower noise is a trend you should definitely keep an eye on.

If you are familiar with high-speed measurement, then you are also familiar with the big challenge that comes with it: the avalanche of data. The faster you measure, the higher the data volume that needs to be handled and turned into meaningful insights.

Therefore, pay attention to data acquisition solutions that offer big data analytics, data management, and data storage technologies that are easy to adapt to the changing requirements for testing.
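To get a feel for the data volumes involved, here is a rough back-of-the-envelope calculation. The channel count, sample rate, and sample size are illustrative assumptions, not figures from any specific system:

```python
# Rough estimate of the raw data rate of a high-speed DAQ setup.
# All numbers below are illustrative assumptions.
channels = 100            # number of measurement channels
sample_rate_hz = 100_000  # samples per second per channel
bytes_per_sample = 4      # e.g. a 32-bit float per sample

bytes_per_second = channels * sample_rate_hz * bytes_per_sample
gb_per_hour = bytes_per_second * 3600 / 1e9

print(f"{bytes_per_second / 1e6:.0f} MB/s, {gb_per_hour:.0f} GB per hour")
# → 40 MB/s, 144 GB per hour
```

Even this modest hypothetical setup produces over a hundred gigabytes per hour of raw data, which is why the storage and analytics side of the solution matters as much as the acquisition hardware.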

2. Smart Data Handling

To work faster and more efficiently, you want to be able to monitor and respond to data in real time, regardless of the data volume. Depending on the type of measurement, its duration, and the sample frequency, an overwhelming avalanche of data can hit you.

Your challenge is not only to collect the data, but to store and analyze it reliably. To reach this goal, you need a solution that handles large data streams faster and more efficiently.

One way a smart data backend can operate is to distinguish between hot and cold data and to handle the two types differently. Raw data and less frequently accessed data that is only needed for auditing or test post-processing ("cold data") is stored in a distributed streaming platform that scales extremely efficiently. If you have to store, process, and calculate new variables from hundreds of thousands of samples per second across hundreds of channels at the same time, this distributed streaming architecture shows its strength.

So-called "hot data", measurement data that must be accessible immediately for analysis, is provided in a time-series database. This database stores data securely in redundant, fault-tolerant clusters, and all measurement data is automatically backed up. Flexible data aggregation ensures that measurement data is continuously moved from the streaming platform to the database at predefined sample rates.

However, the same data can be replayed and stored at a higher sample rate when detailed analysis around an unexpected event or specimen failure is required. This approach minimizes the investment in IT and storage infrastructure in the test lab while maintaining the computing performance needed for test-critical data analysis.
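The hot/cold split described above can be sketched as a simple pipeline: every raw sample goes to a full-rate cold store, a downsampled aggregate feeds the hot store for immediate analysis, and a window around an event can later be replayed from the cold store at full resolution. This is a toy in-memory sketch; the class and method names are hypothetical, not from any actual product:

```python
from statistics import mean

class Backend:
    """Toy sketch of a hot/cold data backend (all names are illustrative)."""

    def __init__(self, decimation: int):
        self.decimation = decimation  # keep 1 aggregated point per N raw samples
        self.cold = []                # raw, full-rate stream ("cold data")
        self.hot = []                 # aggregated time series ("hot data")
        self._buffer = []

    def ingest(self, sample: float) -> None:
        self.cold.append(sample)      # everything lands in the cold store
        self._buffer.append(sample)
        if len(self._buffer) == self.decimation:
            # push one aggregate at the predefined (reduced) sample rate
            self.hot.append(mean(self._buffer))
            self._buffer.clear()

    def replay(self, start: int, stop: int) -> list:
        """Re-read a window at full resolution, e.g. around a failure event."""
        return self.cold[start:stop]

backend = Backend(decimation=10)
for i in range(100):
    backend.ingest(float(i))

print(len(backend.cold))       # 100 raw samples kept
print(len(backend.hot))        # 10 aggregated points for immediate analysis
print(backend.replay(40, 45))  # [40.0, 41.0, 42.0, 43.0, 44.0]
```

In a real deployment the cold store would be a distributed streaming platform and the hot store a time-series database, but the division of labor is the same.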

No matter the exact solution, to ride out any data avalanche ahead you will rely on a smart data backend that provides connectivity services and is adaptable and scalable enough for high-performance edge computing.

As a sweet side benefit, a distributed and scalable data backend gives you even more control over your cost-performance ratio, because you can access your test setup and data from anywhere in the world. Your engineering team may not be in one place, or your biggest client may need support quickly, no matter the time zone: all of this is easily provided through scalable data backends.

3. Modularity

Speaking of cost-performance ratio: this trend is the holy grail we are all chasing throughout our working lives. Keep running costs down while delivering the best performance. But how can you succeed at this in today's fast-paced market? The key to achieving an optimal cost-performance ratio in testing is modularity.

A modular approach to your test equipment offers scalability and flexibility. Maintaining a basic hardware and software platform that can easily be extended with new technologies, such as high-speed or fiber-optic measurement, reduces capital investment. Look for modular solutions that are backward compatible; they extend equipment life and lower maintenance costs.
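One way to picture such a modular platform in software is a common module interface that the base system programs against, so a new measurement type plugs in without touching existing code. The interface and module names below are hypothetical, chosen only to illustrate the idea:

```python
from abc import ABC, abstractmethod

class MeasurementModule(ABC):
    """Common interface the base platform programs against (illustrative)."""

    @abstractmethod
    def read(self) -> float:
        """Return one measurement value."""

class VoltageModule(MeasurementModule):
    def read(self) -> float:
        return 47.9       # stand-in for a real hardware read

class FiberOpticModule(MeasurementModule):
    """A later extension: added without changing any platform code."""
    def read(self) -> float:
        return 1550.2     # e.g. a wavelength in nm

def scan(modules) -> list:
    # The platform only knows the interface, never the concrete modules,
    # so backward compatibility reduces to keeping this interface stable.
    return [m.read() for m in modules]

print(scan([VoltageModule(), FiberOpticModule()]))  # [47.9, 1550.2]
```

The same principle applies on the hardware side: as long as the backplane and protocol stay stable, new acquisition modules slot into an existing chassis.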


Conclusion

The 3 trends you can't ignore any longer all point in the same direction: the structures and systems to be tested change quickly, and your data acquisition system needs to keep pace with them. Big data analytics and a modular approach to your test setup will clear the way to meeting your future needs in testing and measurement.

Find out how we set up a test of a 48V battery management system involving 15 DAQ systems with up to 60 input channels each, measuring voltage and temperature on the lithium-ion cells with an interface to the client's DAQ software using Modbus, Ethernet, and EtherCAT.

Author: Juergen Sutterlueti

Juergen Sutterlueti is Gantner Instruments' Vice President, Energy Segment and Marketing.

Do you have questions? Contact our measurement technology experts: