How can control automation unlock time for security teams?
The last few decades have been marked by an unprecedented explosion of data. The amount of data generated and collected has increased exponentially, driven by factors such as the increasing digitization of information, the proliferation of connected devices, and the rise of big data technologies.
To illustrate this explosion, in 2010 the world was producing about 1.2 zettabytes (1.2 trillion gigabytes) of data per year. By 2020, that figure had risen to 59 zettabytes per year. IDC predicts that by 2025, we will be producing 175 zettabytes of data per year.
To put this into perspective, storing all the data generated in the world in 2025 on DVDs would require a stack of DVDs that could reach the moon 23 times or circle the earth 222 times. 🤯
The data explosion is not going to stop anytime soon. On the contrary, it is expected to accelerate in the coming years due to several factors.
Firstly, the digitalization of companies is becoming the norm, and the amount of data they generate and collect will continue to increase. Additionally, companies are increasingly turning their data and its exploitation into a competitive advantage. The growing use and improvement of AI and ML technologies is enabling companies to make data-informed decisions and drive innovation.
This high data generation is supported by cloud technologies that allow more data to be stored than ever before. According to IDC, by 2025, 49% of the world's stored data will reside in public cloud environments.
Furthermore, as IoT technologies become more sophisticated and widespread, these devices will be increasingly present in businesses, especially factories, producing a significant amount of data.
We can also mention the particular case of the explosion of SaaS use in companies. In 2021, companies with more than 10,000 employees used an average of 447 SaaS solutions. This has drastically increased the number of data sources and the types of data generated. As a result, it is rapidly becoming a nightmare to monitor and analyze data from multiple SaaS platforms.
Time-consuming data management
This phenomenon has created major challenges for companies: managing, using, and securing their data. The amount of data generated every day is so vast that it can be difficult for companies to keep track of it all. Furthermore, some of this data, so-called shadow data, escapes the company's view and therefore its control, which makes it difficult for teams to have an accurate map of the company's data sources.
Moreover, a significant amount of data is unstructured, meaning it is not well-organized and thus difficult to access and analyze. This problem is exacerbated by the fact that most of this data is generated by third-party sources.
As a result, teams must devote considerable time to normalizing the data so that it can be compared and analyzed, and conclusions can be drawn.
Another problem is the increased risk of data breaches and cyber-attacks that come with the proliferation of data. Hackers are always seeking ways to access sensitive data, and the more data a company has, the more vulnerable it is to such attacks.
To protect themselves from attacks, security teams will implement controls on specific tools. However, implementing these controls is time-consuming and requires qualified resources. Consequently, many teams spend a lot of time implementing numerous small controls that have minimal impact at the company level.
These factors mean that currently, 60% of teams' time is devoted to monitoring existing data, not to mention the new data sources being added every day and the shadow data.
To address these various challenges, it is beneficial to have numerous small automated sensors located throughout the company. This allows for easy access to data when incidents occur.
Security teams can easily install these sensors using APIs, avoiding the need to develop connectors for each new data source that is explored. This approach enables security teams to quickly control the number of new data sources that enter the company each day.
Next, security teams will need to establish controls for the new connected data sources and program them for automation. This saves significant time for the security team, as they no longer need to run checks multiple times. Instead, they can create several automated controls.
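To make the idea concrete, an automated control can be modelled as a named rule applied to a data source on a schedule. The sketch below is purely illustrative: the `Control` class, the sample accounts, and the admin-count rule are all hypothetical placeholders, not Trout Software code.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Control:
    """A small automated control: a named rule applied to a data source."""
    name: str
    fetch: Callable[[], list]     # pulls current data from a source (e.g. an API)
    rule: Callable[[list], bool]  # returns True when the data is compliant

    def run(self) -> str:
        data = self.fetch()
        return "OK" if self.rule(data) else f"ALERT: {self.name} failed"

# Hypothetical rule: no more than 3 admin accounts are allowed.
fake_accounts = [{"id": "U1", "admin": True}, {"id": "U2", "admin": False}]
control = Control(
    name="admin-count",
    fetch=lambda: fake_accounts,
    rule=lambda accounts: sum(a["admin"] for a in accounts) <= 3,
)
print(control.run())  # OK
```

Once a check is expressed this way, running it on a schedule is what turns a one-off verification into a standing control.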
The saved time can then be used for other security tasks.
At Trout Software, we created Security Hub to empower security teams to scale their impact.
We understand the issues mentioned above and wanted to create a tool to free up security teams' time from time-consuming and repetitive tasks, allowing them to focus on higher value-added tasks.
Our product is built around three major pillars, with the final goal of enabling security teams to automate controls.
Easy data access
Security Hub was built from the start on a next-generation technology, WebAssembly.
This allows our clients to connect to any type of system or environment and to monitor and control it without ingesting any data (normalization on read).
So you can connect to your different data sources via the connectors developed by Trout Software, or if you have your data on CSV files, you can simply drag & drop them.
Our no-code platform
Security Hub enables you to quickly investigate your data. You can easily normalize and parse your data, and get a structured overview of it, which makes your analysis more efficient.
After structuring your data, Security Hub offers pivot functionality that allows you to perform pivots by simply double-clicking the data you want to pivot on. Additionally, our tool lets you build queries quickly and effectively to investigate your data.
Finally, you can perform quick searches in your data via our search bar. These tools allow you to conduct your investigations more quickly and efficiently.
Furthermore, Security Hub allows security teams to easily deploy pre-built use cases, ranging from basic operations (cloud asset monitoring, SaaS observability) to more advanced ones, such as machine learning applications (anomaly detection, dynamic data anonymization, etc.).
Automation & Control
After identifying interesting use cases, your team can automate controls using the Security Hub Scheduler.
This scheduler enables you to automate controls based on various parameters, such as the number of repetitions, time, date, and intervals between repetitions. With it, you can plan your controls precisely and automatically.
The platform continuously runs automated controls, monitors drift, and keeps compliance records.
By using this framework, security teams can save hundreds of hours of manual work.
Trout Software use cases
When data is available through an API
When data is available via an API, Trout Software allows you to control, for instance, workspace tools such as Slack or Microsoft Teams.
If we take the example of Slack:
Read your data:
You can quickly access all Slack data through our interface using the Slack API.
Prepare the data:
To structure the data, you can perform parsing and normalization on demand. This will allow you to clearly separate people's IDs and names, identify bots, and delete them if necessary.
Once the data is structured, you can proceed with your analysis. For example, you can pivot on an ID or on admin users.
After completing your analysis, you can set up rules related to admin users, for example.
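A rough sketch of that preparation step, using a hand-made sample shaped like a Slack `users.list` response. The field names and records are invented for illustration; the real Slack payload and Security Hub's parsing are richer than this.

```python
# Sample records shaped like Slack's users.list response (invented data).
raw_users = [
    {"id": "U01", "name": "alice", "is_bot": False, "is_admin": True},
    {"id": "U02", "name": "buildbot", "is_bot": True, "is_admin": False},
    {"id": "U03", "name": "carol", "is_bot": False, "is_admin": False},
]

# Normalize: keep id, name, and admin status; drop the bots.
people = [
    {"id": u["id"], "name": u["name"], "admin": u["is_admin"]}
    for u in raw_users
    if not u["is_bot"]
]

# Pivot on admin status: group names by whether they are admins.
admins = [p["name"] for p in people if p["admin"]]
print(admins)  # ['alice']
```

The resulting list of admins is exactly the kind of structured output a rule (e.g. "only approved users may be admins") can then be written against.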
Automate your controls:
After finishing your analysis in your notebook, you can automate the control via the Security Hub scheduler. Depending on the periodicity you choose when setting up your notebook, Security Hub will periodically check that the rule you set up for Slack admins is being respected and will alert you if it is not.
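One way to picture such a scheduled check, using only the standard library. The allow-list, the stand-in for the Slack API call, and the alert format are all hypothetical; Security Hub's actual scheduler is configured through the product, not in code like this.

```python
import sched
import time

ALLOWED_ADMINS = {"alice"}  # hypothetical allow-list decided during analysis

def fetch_current_admins() -> set:
    # Stand-in for a Slack API call; returns invented data.
    return {"alice", "mallory"}

def check_admins() -> list:
    """Return one alert per admin who is not on the allow-list."""
    return [f"ALERT: unexpected admin {name}"
            for name in sorted(fetch_current_admins() - ALLOWED_ADMINS)]

# Run the check once via a scheduler; a real setup would re-enter the
# event at a fixed interval to repeat it.
scheduler = sched.scheduler(time.time, time.sleep)
scheduler.enter(0, 1, lambda: print(check_admins()))
scheduler.run()
```

The set difference makes the rule explicit: any admin outside the approved set produces an alert, and the scheduler decides how often that comparison runs.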
When data is in files and tables
Our goal is to improve monitoring of your IT perimeter. To achieve this, we provide the option of using Security Hub on files or tables.
Let's take the example of troubleshooting systems:
Read your data:
To troubleshoot your systems, you may want to access logs stored in an S3 bucket. With Trout Software, simply create a connector to that specific bucket and the logs become accessible in the platform; there is no need to ingest or normalize them ahead of time.
Prepare the data:
To structure the data, perform an on-demand analysis and normalization. This distinguishes the different parameters related to the machine and makes the data file easy to read. After structuring the data, proceed to analyze it. You can search by keywords in the search bar (e.g. machine name) or pivot on a specific time. Once the analysis is complete, add a note to your notebook indicating how you solved the problem. This centralizes knowledge and adds context for future analysts.
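The on-demand parsing described above might look like this on a few syslog-style lines. The log format, field names, and records are assumptions made up for illustration; real logs and Security Hub's parsers will differ.

```python
import re

# Invented syslog-style lines, as they might sit in an S3 bucket.
raw_logs = [
    "2024-03-01T10:00:00Z web-01 sshd: Accepted password for root",
    "2024-03-01T10:05:00Z db-02 kernel: Out of memory: kill process 4242",
    "2024-03-01T10:07:00Z web-01 sshd: Failed password for admin",
]

# One pattern splits each line into timestamp, host, process, and message.
LINE_RE = re.compile(r"^(?P<ts>\S+) (?P<host>\S+) (?P<proc>[^:]+): (?P<msg>.*)$")

# Normalize on read: structure the lines only when they are queried.
events = [LINE_RE.match(line).groupdict() for line in raw_logs]

# Keyword search, e.g. by machine name...
web01 = [e for e in events if e["host"] == "web-01"]
# ...or a pivot on a time window (ISO timestamps compare lexicographically).
after_1005 = [e["msg"] for e in events if e["ts"] >= "2024-03-01T10:05:00Z"]
print(len(web01), after_1005)
```

Once every line carries the same fields, searching by machine name or slicing by time becomes a simple filter rather than a grep through raw text.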
Automate your controls:
Once you complete your analysis in your notebook, you can, as in the Slack example, automate the check using the Security Hub scheduler. You can choose the frequency of the checks, or a specific day or date. If a deviation from the check is detected, you will receive an alert.
Want to learn more about cybersecurity?
You can read our other blog posts, such as: