Beyond the Hype: Open Source, AI Reality, and Sustainable Innovation in Chip Design

Felix Klein, PhD

Reading time: 18 minutes

Table of contents


  1. Open Source Is a Way of Working
  2. AI: Separating Signal from Noise
  3. Other Hypes: What’s Real and What’s Not
  4. Building a Team for This Approach

Open Source Is a Way of Working

QBayLogic has built its business on a foundation of open source tools, future-proof working methods, and a clear-eyed view of which technologies actually deliver value. We sat down with Felix and Christiaan to discuss how their commitment to open source shapes everything they do, where AI genuinely helps (and where it doesn’t), what’s happening with emerging technologies like quantum computing and photonics, and why their distinctive working style both attracts talent and keeps projects on track.

Let's start with open source technology. Can you describe what working with open source tools means for the way you approach your projects?

Felix: First of all, it partially dictates who we work with. For one client in cybersecurity, it’s a requirement. They want to make sure that maintenance is long-term and that the technology can’t fail because… well, companies and organizations have learned that tools may become legacy after 10 or 20 years. They want to invest more long-term. They see that things are longer lasting with open source.

Christiaan: For Google, for example, it’s practical. They want tools where they’re not dependent on one party for maintaining things. They’re big enough that they could take over maintenance if needed. The open source part means they can spin up the program as many times as they want, unencumbered by licenses. Compare that to non-open source tools where you can only start it once per license, or you pay for a single user. For us, this also means we get more use of our tools because we can run more concurrent tests or experiments.

Felix: And we enjoy the flexibility that gives us. If something is not working, we can look into it and fix it quickly ourselves. It makes our life easier in many situations.

How does this affect your projects?

Christiaan: Most of our projects are quite risky and unpredictable, because most of the things we do have never been done before. That’s why we get hired in the first place. This makes our kind of projects hard to estimate.

Felix: But our approach helps manage those risks. Because we can spin up way more tests and run them automatically, we explore more corner cases early. That means we know relatively early on if we might hit certain problems. If we were limited in simulation time because of license costs, we might not discover those issues until much later. Because of the dependency on that vendor or license, you would test less to save money. Which makes no sense, really.

Christiaan: We can run many tests when we make changes to our code. We can run tests overnight. We’re unencumbered in spinning up as many simulations and tests as we want. A limited license pool would potentially slow down the development cycle. The way of working makes it possible to stay mostly on track in terms of meeting deadlines.

Plus, clients can easily replicate our setup. If we used proprietary tools, they would have to talk to distributors or vendors to get a license. They’d have to involve their purchasing department and budgeting. Things that are free and open source don’t show up in the budget.

Felix: That’s a major point. No red tape for researchers on the client side. They can just download the tools and get started.

What about when things go wrong with the tools themselves?

Felix: That’s where open source really shines. I remember debugging for two weeks on one issue where it turned out a proprietary tool wasn’t supporting something properly. There wasn’t even an official way to communicate that issue to the vendor without signing a support contract. Two weeks of lost time just because we couldn’t look under the hood!

Christiaan: With proprietary FPGA vendor tools, you can run into situations where the toolchain becomes a usability nightmare. You’re at the mercy of their support team, their documentation, their release schedule. With open source, if something isn’t working, we can look into it and fix it ourselves. And that matters when you’re building systems that need to operate for decades. Medical devices, aerospace systems, infrastructure – these can’t be dependent on whether a specific vendor stays in business or maintains support for a particular product line.

Felix: There’s another aspect: research organizations and universities don’t want to deal with license restrictions. They want their students and researchers to be able to just get started without red tape. Students then get familiar with these open source tools and platforms they already use – like GitHub. When they come into companies like QBayLogic, there’s a natural push from young people to use more of these tools. New startups come up using this approach. The trend is evolving: startups are using more and more open source tooling, and we’re starting to see big companies try to adapt.

AI: Separating Signal from Noise

Let's talk about AI. It seems impossible to have any tech conversation without it coming up. What's your perspective on the current AI wave?

Felix: It’s really crazy. At the moment, the AI hype basically distorts everything – everything is seen through AI glasses. It’s hard to see the actual development underneath.

What do you mean by "distorts everything"?

Felix: It’s almost impossible to get funding for anything without ‘AI’ in the title. That’s the problem. If you try to do something different at the moment, it’s very difficult to get money for it. The AI hype is where the funding flows.

But you do use AI in your work, right?

Christiaan: Yes, we’ve started to use AI tools for our work. At least I have. And there are parts where it works and parts where it doesn’t.

Where does it help?

Christiaan: One of the surprising things is that it works well for the requirements phase of a project. If you give your system requirements document a section called “Open questions to be determined,” the algorithm actually puts stuff there that forces you to think about ambiguities or vague things up front. Instead of leaving them to implementation, you’re catching these issues early.

So, it's helping with the review process?

Christiaan: Exactly. The AI helps with that review aspect. It makes it easier to come up with a design that catches corner cases instead of being halfway through an implementation and then discovering an ambiguity or vagueness in the specification. When you have been working on something for a long time, you might not recognize that there’s an important detail missing. The AI can help identify those gaps.

What about AI for actual code development?

Christiaan: It’s good for stuff that you might want to throw away anyway. For example, we were working with a client where we were developing FPGA firmware and they were developing the software. Our firmware was done and there was a communication protocol that had to be initiated from the software side, but that wasn’t done yet. I could just give the AI tools the specification for the protocol and ask it to create a Python script to test this part of the firmware. Five minutes later, I had it. Otherwise, I might have had to spend a day trying to figure out how to set up all these things. Within five minutes I could start solving my actual problem instead of spending a whole day just typing. I picked Python specifically because there’s a huge corpus of publicly available code that the AI was trained on.
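As an illustration, a minimal Python script of the kind Christiaan describes might look like this. The frame layout (a 0xA5 start byte, command, length, payload, and XOR checksum) and the TCP port are hypothetical stand-ins, not the client’s actual protocol:

```python
# Sketch of a throwaway protocol exerciser for FPGA firmware.
# Frame format and port number are illustrative assumptions only.
import socket

START = 0xA5  # hypothetical start-of-frame marker

def checksum(data: bytes) -> int:
    """XOR of all bytes – a common lightweight integrity check."""
    c = 0
    for b in data:
        c ^= b
    return c

def build_frame(command: int, payload: bytes = b"") -> bytes:
    """Encode a command into the hypothetical wire format."""
    body = bytes([START, command, len(payload)]) + payload
    return body + bytes([checksum(body)])

def parse_frame(frame: bytes):
    """Decode a frame; raise ValueError on a bad start byte or checksum."""
    if frame[0] != START or checksum(frame[:-1]) != frame[-1]:
        raise ValueError("malformed frame")
    length = frame[2]
    return frame[1], frame[3:3 + length]

def exercise_firmware(host: str, port: int = 5000) -> bytes:
    """Open a TCP connection to the board and send a status request."""
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(build_frame(0x01))  # hypothetical STATUS command
        return sock.recv(256)            # firmware's reply, if any
```

A script like this can be thrown away once the real software side exists; its only job is to confirm the firmware’s half of the handshake works.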

Felix: That’s the key – where you only need the insight and not the code itself. It shortens those steps, makes it faster, and then you can focus more on the central path.

What about AI-generated code that you keep and maintain? Have you tried using it for Haskell?

Christiaan: That’s more problematic. For the actual code that becomes part of your system, I haven’t found the AI tools as helpful. I’ve used Gemini for requirements work – that’s worked quite well in my experience. For Haskell, there’s not a lot of training data. For VHDL there’s more available on the internet, so it works a little better. But still…

What about other applications of AI in chip design?

Felix: I think there’s a lot of potential in the technical problem-solving that happens at the ‘bottom’ of the design process. In an FPGA, once you have your design, it needs to be synthesized, placed, and routed. There are a lot of heuristics people have handcrafted in the past. I can see AI helping there, similar to how it helped solve protein folding. AI-generated designs could help with optimization – performance improvements or resource usage improvements.

Christiaan: Though it’s worth noting that those are things you could also achieve with traditional development. It’s not like AI is unlocking something impossible – it’s potentially just faster.

Felix: Right. This would mostly boost performance or make optimizations easier to apply. But this isn’t happening yet in any significant way. I’ve seen people try it, but I haven’t seen any big success stories yet.

What about AI for formal verification and proofs?

Felix: That’s an interesting question. For day-to-day verification work, if AI could provide an exhaustive verification and give you a certificate that you can verify – how it achieves that doesn’t really matter to me, as long as I have that 100% verifiability in the end.

Christiaan: But there’s the question of whether proofs need to be maintainable in the same way that code does. With code, there’s lots of refactoring, reading, and rewriting involved. Is that the same with proofs?

Felix: The maintenance burden on proofs is way smaller. They mainly need to be updated when standard libraries change. The key thing is repeatability – if I rerun the AI and it can’t find a proof the next time, that’s a problem. But if the AI somehow encodes it into the source code and then you don’t need it anymore, that could work.

Other Hypes: What’s Real and What’s Not

Beyond AI, what other technology trends are you watching?

Felix: Well, blockchain – that hype is over. There’s not much happening with that any more. But there are other trends. Cybersecurity is definitely important. We will start to see more innovations, for example in post-quantum secure cryptography – a lot of funding is already flowing in that direction. Then there’s securing infrastructure – a focus area of innovation, with a lot of attention going to hardening hardware at that fundamental level.

What about quantum computing itself?

Felix: On the chip market, there’s a lot happening with quantum chips. But it’s still too early to see where that’s going. We keep an eye on it because it’s a potential disruptor, but I couldn’t tell you where that is going yet.

Christiaan: These could be market breakers at some point, but the technology isn’t there yet for practical applications in what we do.

What about photonics?

Christiaan: Photonics is interesting, but the applications are quite different than what we use normal microchips for. If you compare a photonic structure to a transistor, it’s almost 100 times as big. An Nvidia GPU has something like 4 billion transistors. If you try to put the photonics equivalent in the same space, you might get to 500 structures.

So, you can't do massive calculations the same way?

Christiaan: Not in the way we’re used to, no. At least not yet. It’s a different paradigm.

Felix: Most of these developments are from material science and don’t directly affect our work yet. They are, however, changing the whole paradigm of what ‘computing’ is, so they could disrupt the FPGA market at some point, but at the moment it’s still too early to tell.

On a more general level, is the world of hardware design and research changing? And if so, how?

Felix: I think the current wave of AI development, for example, has a very practical focus. Because of this focus on applications, it leaves many theoretical questions unanswered. Research in general is definitely more practically focused these days than it used to be. A lot of theory and open questions from the past decade are being pushed aside.

Is that good or bad?

Felix: It’s a mixed thing. On one hand, it’s nice because it brings industry and research closer together. People have a more practical view. On the other hand, you can see in the funding programs that there’s more focus on building connections than on fundamental research. More startup funding, incubators, organizations trying to bring inventions together and focusing on applications rather than fundamental discovery.

Building a Team for This Approach

With your distinctive working style and focus on open source, how do you approach building your team?

Christiaan: It’s interesting – people talk about a shortage of engineers in our industry, but as QBayLogic we’re not experiencing any of that. Maybe that’s because of our size, but I think it’s also because of our approach.

Felix: We’re looking for somewhat different people than traditional FPGA companies might be looking for.

Different how?

Christiaan: In our way of working, we can recruit people fresh out of university – somebody with zero years of experience. We have our internal training to bring people up to speed.

Felix: Exactly. So we’re not competing for the experienced talent everyone else wants to attract. We actually prefer to train people ourselves, because experienced engineers have often trained on tooling or learned a working style in the past that wouldn’t fit with what we do – the open source tooling, our whole setup. They would need to retrain anyway.

Christiaan: Our setup isn’t widely used yet, so how would we even hire 20 people who already know how to set this stuff up?

Felix: If you know how to work with this approach, you either work here or at a handful of other places doing similar things.

Does that affect retention? Do people stay long-term?

Christiaan: Yes, people tend to stay. Once they’ve learned this approach and seen the benefits, it’s not something they can easily find elsewhere.
