“Made by usd HeroLab” – Sebastian Puttkammer about Tools, Quality and Efficiency

4. February 2020

Driven by the motivation to simplify the work for all team members, a team under the leadership of Sebastian Puttkammer, Managing Consultant at usd HeroLab, developed tools “made by usd HeroLab”. We asked what developments the recent years have brought and how they contribute to increasing the quality and efficiency of the usd HeroLab.

Sebastian, you have developed many of the HeroLab Tools yourself. How come?

Sebastian Puttkammer: In the past, we had to do a lot of research during a pentest and were often faced with the challenge that, for example, we had to manually retest a vulnerability as many times as it had been found during a pentest. We already had self-written scripts that supported us, but they were very static and complex. So we started thinking about how to optimize the workflow for the whole team.

What were your initial ideas?

SP: The self-written scripts performed special security checks with existing tools. However, the script collection could only be expanded or adapted to a limited extent and was very complex in structure. So we started to automate our existing pentest tools. This was the beginning of the “Icebreaker”.

Can you briefly explain how the “Icebreaker” works?

SP: The “Icebreaker” is a platform that combines all of our in-house developments with the best public tools. Either the analysts themselves or our development team integrate and document our research results in the form of plug-ins or tools. This way we generate a constantly growing knowledge database, which enables the automated detection of current vulnerabilities. The analyst then verifies the complete tool output on the dashboard. We strive not to generate any false positives. Since the standard checks are covered by the “Icebreaker”, we have more time for complex, manual analyses. It also guarantees that every environment is pentested at a consistently high level of quality. This creates a huge gain in efficiency and quality. For us and for our clients.
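The internals of the “Icebreaker” are not public, so the following is only a minimal sketch of how a plug-in based check platform of this kind could be structured. Every name here (`Finding`, `PLUGINS`, `plugin`, `run_all`, the example check) is a hypothetical illustration, not the actual implementation:

```python
# Purely illustrative sketch of a plug-in based scan platform.
# All names are assumptions; the real "Icebreaker" is proprietary.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Finding:
    plugin: str
    target: str
    detail: str
    verified: bool = False  # the analyst confirms each finding on the
                            # dashboard, avoiding false positives

# Registry of checks: the knowledge base grows as new research
# results are integrated as plug-ins.
PLUGINS: dict[str, Callable[[str], list[Finding]]] = {}

def plugin(name: str):
    """Decorator that registers a check under a stable name."""
    def register(fn: Callable[[str], list[Finding]]):
        PLUGINS[name] = fn
        return fn
    return register

@plugin("outdated-banner")
def check_banner(target: str) -> list[Finding]:
    # A real check would probe the target; this stub only shows
    # the shape a plug-in result takes.
    return [Finding("outdated-banner", target,
                    "server header reveals software version")]

def run_all(target: str) -> list[Finding]:
    """Run every registered check so standard tests are never skipped."""
    findings: list[Finding] = []
    for check in PLUGINS.values():
        findings.extend(check(target))
    return findings
```

The key design idea hinted at in the interview is the registry: because every analyst can contribute a plug-in, standard checks accumulate into a consistent baseline that runs on every engagement.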

How do you ensure that the “Icebreaker” consistently meets HeroLab quality standards?

SP: The “Icebreaker”, like all our self-developed tools, is constantly tested and further developed by our development team. For the functional tests we use our own tool, which automatically hosts vulnerabilities. This is the best functional test one can do. By the way, the test tool has its origins in the Summer/Winter School, the training program for our working students. This creates really cool synergy effects: Our colleagues from the training team are happy that new vulnerabilities are added for the students and the development team is happy that the tool can be used for functional testing.

Can all HeroLab tools be integrated with each other?

SP: The tool “ExPeRT” is our central interface where all tools converge. The results from the “Icebreaker” are transferred to “ExPeRT” via an interface. With the help of this overview, the project manager always stays up to date on the current project status and can assign tasks to teammates. Questions and comments are displayed in “ExPeRT”, enabling efficient project collaboration. This is especially important when several colleagues work on a project together. In addition, “ExPeRT” displays all necessary steps in interactive checklists, which ensures that no step is overlooked by an analyst.

How do you then prepare the pentest results for your clients?

SP: Our module team creates a text module for each standard vulnerability, which is continuously updated, quality assured and imported into “ExPeRT”. In other words, if “Icebreaker” automatically detects a vulnerability, it is immediately transferred to “ExPeRT”. The analyst then adapts the text module with corresponding recommended measures according to the individual project, completes the management summary and summarizes the essential recommended measures.
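The text-module approach described here, a quality-assured standard text per vulnerability that the analyst adapts per project, could look roughly like this minimal sketch. The module text, vulnerability key, and placeholder names are invented for illustration:

```python
# Hypothetical sketch of reusable report text modules: one
# quality-assured template per standard vulnerability, filled in
# with project-specific details by the analyst.
from string import Template

MODULES = {
    "sql-injection": Template(
        "The application at $target is vulnerable to SQL injection. "
        "Recommended measure: use parameterized queries ($measure)."
    ),
}

def render(vuln: str, **context: str) -> str:
    """Fill a standard text module with project-specific values."""
    return MODULES[vuln].substitute(**context)

print(render("sql-injection",
             target="shop.example",
             measure="prepared statements"))
```

Keeping the wording in one central, continuously updated place is what makes the reports consistent across 80 analysts.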

Today about 80 colleagues work with your tools at the HeroLab. How does it feel to know that?

SP: Everyone here is a full-blooded analyst and naturally skeptical about the quality and security of tools. We only use tools that we have tested ourselves through pentests and code reviews. With such a user base, it makes us quite proud to see how well the solutions are accepted and that they contribute significantly to the quality of our work. This is a real motivation to keep going. That’s why we’ve been working intensively on the next member of our tool family (laughs).
