Dan Isla is the VP of product at itopia. Before joining itopia in 2021, Dan spent four years at Google as a cloud solutions architect. Dan also worked at the NASA Jet Propulsion Laboratory building rovers and launching rockets. Dan is the creator of the open source project Selkies, which enables stateful workload orchestration on Kubernetes and per-user containerized environments with GPUs and WebRTC streaming.
Building a technology product almost always means raising capital. From angel investment through Series A-E growth capital to IPO, technology companies raise funds to make faster market moves, attract better talent and quickly reach the scale their product and business need to realize their full potential.
Before a company secures capital, investors conduct technical due diligence: a comprehensive assessment of the quality and maturity of its product architecture, code base, security and the operating processes of its technology organization.
While the CEO is involved in this process, responsibility for technical due diligence often falls on the CTO, VP of engineering and other key leaders of the technology organization.
By being prepared for this process, a startup can instill greater investor confidence and secure funds more rapidly.
Having navigated technical due diligence multiple times, we’ve distilled our experience into five ways that companies can ensure they’re prepared for the road ahead. In this article, we won’t address the quality of a given product or code base, but we’ll focus instead on best practices, tools and suggested processes deployed by companies that have successfully navigated technical due diligence over the past 12 months.
1. Streamline Developer Onboarding and Offboarding
Onboarding new developers is hard.
Ask any developer about their first week on a job and they’ll lament the pain of setting up developer environments before they can even begin to code. Most development is still done on local machines, and it can take weeks to ship a machine to a new hire or upgrade an existing one.
Once the machine arrives, developers must configure it with all the tools and dependencies they need. The list of items to configure is long, and local environments resist consistent configuration because of the many interdependencies among locally installed tools.
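To make that checklist concrete, here's a minimal sketch of a bootstrap check that verifies required tools are installed before a new hire starts coding. The tool list is hypothetical, not a recommendation; a real manifest would also pin expected versions:

```python
import shutil
import subprocess
import sys

# Hypothetical tool manifest: command name -> flag that prints its version.
REQUIRED_TOOLS = {
    "git": "--version",
    "docker": "--version",
    "node": "--version",
}

def check_tools() -> bool:
    ok = True
    for tool, version_flag in REQUIRED_TOOLS.items():
        if shutil.which(tool) is None:
            print(f"MISSING: {tool} is not on PATH")
            ok = False
            continue
        version = subprocess.run(
            [tool, version_flag], capture_output=True, text=True
        ).stdout.strip()
        print(f"OK: {tool} -> {version}")
    return ok

if __name__ == "__main__":
    sys.exit(0 if check_tools() else 1)
```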
Personalization of the environment is also important. Coding is both an art and a science, and developers are most productive when personal preferences for keyboard shortcuts, key bindings and environment aesthetics support their individual styles of coding. Onboarding should efficiently support personalization for each developer.
Offboarding developers also poses challenges. While companies are doing everything they can to retain valuable developers, the reality is they're in high demand across virtually every industry, and they change jobs frequently. According to 2021 Bureau of Labor Statistics data, the average turnover rate for software developers is 57.3%.
As developers join and leave teams more quickly than before, companies need to move with agility to revoke code access or ramp up a replacement developer.
The new normal of hybrid work has revealed a need for dev team administration tools that allow engineering managers to provision and monitor developer access via a portal, securely onboard and offboard developers regardless of device or location, and maintain consistent environments across distributed endpoints.
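What scripted offboarding can look like in practice: the sketch below revokes a departing developer's GitHub organization membership through GitHub's REST API. The organization name is a placeholder, and the script assumes a token with `admin:org` scope in the `GITHUB_TOKEN` environment variable; your source-control platform will differ:

```python
import os
import sys
import requests

ORG = "example-org"  # hypothetical organization name

def offboard(username: str) -> None:
    token = os.environ["GITHUB_TOKEN"]  # assumes a token with admin:org scope
    resp = requests.delete(
        f"https://api.github.com/orgs/{ORG}/members/{username}",
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.github+json",
        },
        timeout=10,
    )
    # 204 means the membership was removed; 404 means the user wasn't a member.
    resp.raise_for_status()
    print(f"Revoked org access for {username}")

if __name__ == "__main__":
    offboard(sys.argv[1])
```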
In technical due diligence processes, fast, hardware-agnostic developer onboarding results in happier, more productive dev teams that maximize time spent on coding. Efficient offboarding shows that a startup is mature enough to protect its interests when a developer or contractor leaves the organization. Proving you’re on top of both processes conveys to potential investors that your team is ready to scale into a larger organization.
2. Secure Intellectual Property
Ninety-five percent of organizations surveyed by the Linux Foundation in 2021 said they were concerned about software security, and for good reason. In its anonymized analysis of 700,000 devices, Code42 saw a 61% increase in data exposure events in Q2 2021, 11% of which were related to source code exposures that coincided with job shifts in the economy.
The cost of a data breach across all types of sources increased by 10% in 2021 to $4.24 million, according to IBM Security — and that figure was $1.07 million higher when remote work was involved because it took longer to detect and contain breaches.
Data exfiltration, whether intentional or unintentional, is a growing organizational threat that is increasingly difficult to handle as teams become more distributed. Development teams need new ways to manage exfiltration risk without compromising productivity.
Companies can guard against exfiltration with better documentation of their security vulnerabilities, proactively highlighting areas of weakness and making problems more traceable when they occur. Open source technologies can be more flexible, secure and auditable than closed technology, so their growing use is not surprising. But there are downsides too. Tools like Snyk's open source vulnerability database and the web security testing guide from the Open Web Application Security Project (OWASP) can help developers keep tabs on and test for weak links in code security.
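As an illustration of this kind of dependency check, here's a sketch that queries OSV.dev, a freely accessible open vulnerability database comparable to the Snyk database mentioned above (OSV is my substitution for illustration, not a tool named in this article). The package and version are examples; in practice you'd iterate over a lockfile:

```python
import requests

# Query the open OSV.dev vulnerability database for a pinned dependency.
def check_package(name: str, version: str, ecosystem: str = "PyPI") -> list:
    resp = requests.post(
        "https://api.osv.dev/v1/query",
        json={"version": version, "package": {"name": name, "ecosystem": ecosystem}},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json().get("vulns", [])

if __name__ == "__main__":
    # Deliberately old version used as an example; expect several advisories.
    for vuln in check_package("jinja2", "2.4.1"):
        print(vuln["id"], "-", vuln.get("summary", "no summary"))
```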
The National Institute of Standards and Technology (NIST) is also promoting the development of a software bill of materials (SBOM): “a formal record containing the details and supply chain relationships of various components used in building software.” NIST further explains, “the intent of SBOMs is to provide increased transparency, provenance and speed at which vulnerabilities can be identified and remediated by departments and agencies.” Based on its 2021 survey, the Linux Foundation expects SBOM use to increase from 47% in 2021 to 88% in 2023.
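Dedicated tools exist for producing SBOMs, but as a minimal sketch of what such a record contains, this script emits a CycloneDX-style JSON document (one popular SBOM format) listing every Python package installed in the current environment. Real generators capture far more, including the supply chain relationships NIST describes:

```python
import json
from importlib.metadata import distributions

def build_sbom() -> dict:
    components = []
    for dist in distributions():
        name = dist.metadata["Name"]
        version = dist.version
        components.append({
            "type": "library",
            "name": name,
            "version": version,
            # Package URL (purl) identifies the component's origin.
            "purl": f"pkg:pypi/{name.lower()}@{version}",
        })
    return {
        "bomFormat": "CycloneDX",
        "specVersion": "1.4",
        "version": 1,
        "components": sorted(components, key=lambda c: c["name"].lower()),
    }

if __name__ == "__main__":
    print(json.dumps(build_sbom(), indent=2))
```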
Development teams can also take advantage of layered security approaches, beginning with local machines. Endpoint isolation controls — limiting clipboard access, printing and file transfer — can accomplish this while also protecting personalization settings. Mainstream use of “zero trust” policies and tools is also accelerating. With development teams increasingly distributed, traditional security features designed to protect against intrusion on colocated assets don’t effectively secure development work today. Companies need to protect against exfiltration from devices in many locations, and zero trust better supports decentralized security while maintaining productivity. For example, zero trust helps to automate security capabilities, reducing the risk of manual error and making policies more flexible and adjustable as security needs change.
IBM Security’s Cost of a Data Breach 2021 report notes that companies using security AI and automation incurred less than half the cost of data breaches of those that didn’t because they were able to detect and contain the breach much sooner.
During technical due diligence, zero trust security practices, endpoint isolation and a culture of acknowledging and documenting vulnerabilities show potential investors that a team is equipped to manage these risks, no matter how dramatically the playing field changes in the future.
3. Version-Control Everything
In the past decade, we’ve seen the rise and standardization of “as code”: Infrastructure-as-Code, Monitoring-as-Code, Policy-as-Code and soon perhaps Data-as-Code. “Stuff-as-code” means statelessly automating the management of “stuff” via version-controlled, declarative configuration files.
Version-control systems like Git, Perforce and Subversion automatically organize files and coordinate their creation, editing and deletion across a team. While all dev teams already use version control for source code, most don’t embrace it for infrastructure and application configuration files, deployment scripts and other “nonsource configuration stuff.”
Teams that use version control are more likely to have mature processes for cloud native development practices, such as continuous delivery. This is because teams using version control can reproduce any environment automatically, access historical environments and query the versions of each library or dependency used by older applications and legacy code.
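For instance, with everything version-controlled, querying the dependencies an older release used is one command away. Here's a small sketch (the tag and file name are hypothetical):

```python
import subprocess

# Read a dependency manifest as it existed at an old release tag,
# without checking the tag out.
def manifest_at(ref: str, path: str = "requirements.txt") -> str:
    return subprocess.run(
        ["git", "show", f"{ref}:{path}"],
        capture_output=True, text=True, check=True,
    ).stdout

if __name__ == "__main__":
    print(manifest_at("v1.0.0"))
```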
One valuable category of “stuff” to version-control is developer-environment configuration. The common problem of developer-environment configuration drift is a minor inconvenience when revisiting old projects or following outdated tutorials, but across a professional dev team, it can cost each developer days of progress every month restoring or updating their environments, especially on a technologically diversified team. Version control can reduce that time by providing your team members with a foundation for productivity, and version pinning can prevent the insecure drifting of your dependencies.
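Pinning can even be enforced automatically. This sketch flags any pip-style requirements line that isn't pinned to an exact version; the file name and format are assumptions, since lockfile conventions vary by ecosystem:

```python
import re
import sys

# A pinned pip requirement looks like "package==1.2.3".
PINNED = re.compile(r"^[A-Za-z0-9._-]+\s*==\s*[\w.]+")

def unpinned(path: str = "requirements.txt") -> list[str]:
    bad = []
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#"):
                continue  # skip blanks and comments
            if not PINNED.match(line):
                bad.append(line)
    return bad

if __name__ == "__main__":
    offenders = unpinned()
    for line in offenders:
        print(f"UNPINNED: {line}")
    sys.exit(1 if offenders else 0)  # nonzero exit fails a CI check
```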
Versioning your config files (dependencies, infrastructure, dotfiles, etc.) gives your developers a safer path through the dangerous woods of bottom-up security decision-making. And it signals to a potential investor that a dev team is operationally mature. For hybrid software teams collaborating from multiple locations, versioning provides an implicit review process for software decisions, helps you understand your security attack surface and smooths the onboarding and collaboration process for team members.
4. Use Synthetic Data for QA and Staging Environments
First, there was data. Then there was big data. And today, with artificial intelligence (AI) and machine learning (ML), there is synthetic data.
Synthetic data mimics a company’s production data so testing and staging environments mirror real-world conditions without compromising user privacy. This lets companies share data safely, retaining the signals in their real data without the risk of exposing actual records during testing, security exercises and disaster recovery drills.
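Purpose-built platforms model the statistical properties of production data; as a toy illustration of the schema-preserving idea only, here's a sketch that fills a hypothetical staging table with fake user records using the open source Faker library (the column set is made up):

```python
import csv
import random
from faker import Faker  # pip install faker

fake = Faker()
Faker.seed(42)   # seeding makes the fake data reproducible
random.seed(42)

def synthetic_users(n: int = 100) -> list[dict]:
    return [
        {
            "id": i,
            "name": fake.name(),
            "email": fake.email(),
            "signup_date": fake.date_between(start_date="-2y").isoformat(),
            "plan": random.choice(["free", "pro", "enterprise"]),
        }
        for i in range(1, n + 1)
    ]

if __name__ == "__main__":
    rows = synthetic_users(10)
    with open("staging_users.csv", "w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)
```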
Synthetic data is relatively new, but it is being adopted rapidly, with vendors like Tonic.ai, Gretel.ai, Hazy and Mostly AI offering tools to generate it. Gartner predicts that ML and analytics platforms will be expected to generate synthetic data by 2024.
In industries where customer data is extremely sensitive or regulated — think banking, public sector, health care and telecommunications — there’s an obvious need for synthetic data for safety and compliance in areas such as data sharing, API development and proactively finding or addressing security vulnerabilities.
Synthetic data is increasingly playing a role in AI training, where massive amounts of real data are needed to train and optimize algorithms. Because algorithms can show bias or imbalance when new data or patterns emerge, Gartner predicts 85% of AI projects will deliver erroneous outcomes due to bias in data through 2022. Synthetic data can intentionally create rare patterns to help AI and ML technologies perform more accurately and correct for bias.
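To convey the rare-pattern idea in its simplest form, the sketch below synthesizes extra examples of an underrepresented class by jittering the numeric features of real rows. Real synthetic data platforms use learned generative models; this naive version, with made-up fraud rows, only illustrates the intuition:

```python
import random

# Naive oversampling: create synthetic rare-class rows by adding small
# multiplicative noise to features of real rare-class rows.
def oversample(rare_rows: list[list[float]], target: int,
               noise: float = 0.05) -> list[list[float]]:
    synthetic = []
    while len(synthetic) < target:
        base = random.choice(rare_rows)
        synthetic.append([x * (1 + random.gauss(0, noise)) for x in base])
    return synthetic

if __name__ == "__main__":
    random.seed(0)
    fraud_rows = [[120.0, 3.2], [87.5, 4.1]]  # hypothetical rare-class rows
    print(oversample(fraud_rows, target=5))
```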
For companies entering tech due diligence with AI or ML in their products, synthetic data indicates maturity in testing, compliance and data objectivity, and a readiness for the technology partnerships and data sharing that often follow a large round of investment. And with PwC finding more than 870 AI-related merger-and-acquisition transactions and public offerings between 2016 and 2020, it’s likely to become mainstream.
5. Containerize Developer Environments
Maintaining and securing developer environments for remote work is challenging. Dependency hell and tales of “works on my machine” create reliability problems. Containers and web-based integrated development environments (IDEs) solve these problems by applying DevOps principles to the development environment.
Containerized development environments are not a new concept, but they require additional technology to orchestrate and deliver them at scale. Traditional open source streaming technologies like Virtual Network Computing (VNC) lack native web integration and the ability to deliver high-definition streaming. The open source Selkies project was started at Google and uses Kubernetes, Istio, GStreamer and WebRTC to deliver high-definition, high-frame-rate streaming experiences to the browser. itopia is now the primary maintainer of this project, and we welcome new contributors.
Developer environments are a great use case for applying DevOps principles to the development process, starting with the IDE. By delivering the IDE and all of its dependencies via a web browser with containers, the environment becomes consistent, manageable and more secure for remote workers. Web technologies automate complex authentication flows and deliver a portable, low-latency, graphics-intensive experience for developers.
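As a minimal sketch of the idea (not how Selkies or itopia Spaces actually work), the script below starts a browser-based IDE in a container by launching the open source code-server image with the current project directory mounted in. A production setup would layer on authentication, TLS and the security controls discussed above:

```python
import subprocess

# Launch a browser-based IDE (code-server) in a container, mounting the
# given project directory. Image name and port reflect code-server's
# published defaults; adjust for your own base image.
def launch_dev_space(project_dir: str) -> None:
    subprocess.run(
        [
            "docker", "run", "--rm",
            "-p", "127.0.0.1:8080:8080",
            "-v", f"{project_dir}:/home/coder/project",
            "codercom/code-server:latest",
        ],
        check=True,
    )

if __name__ == "__main__":
    launch_dev_space(".")  # then open http://127.0.0.1:8080 in a browser
```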
As remote work increases, environment portability, security and code isolation become more important. Fully managed services like itopia Spaces deliver IDEs in a web browser. With the entire development build toolchain and any Linux-based IDE available, enterprise IT admins can create containerized templates for developers to ensure a consistent environment across teams and projects. Spaces also automates the security configuration, network peering, storage and single sign-on identity systems to integrate with large enterprise systems.
Containerizing developer environments also supports the technical due diligence process itself. Beyond showing investors that a company has mature onboarding and security procedures, due diligence often requires demoing previous code versions or legacy applications that illustrate the value of a company’s past technology but may no longer run in production. Containerized dev environments make it easy to demo this technology with dependencies already in place, speeding up the process and building credibility with potential investors.
Ready for Next-Level Funding?
Taken individually, these five considerations represent the application of best practices to your development-related processes and environments. But to a potential investor, they’re ample evidence of an organization that is ready for next-level funding.
These practices can be deployed at any scale, regardless of where your organization is on its maturity pathway. And that can make all the difference to a keen-eyed investor who is evaluating your organization as a prospective opportunity.
The New Stack is a wholly owned subsidiary of Insight Partners, an investor in the following companies mentioned in this article: Tonic.