Hack Week 2025: How these engineers liquid-cooled a GPU server

// By Catie Keck • Aug 27, 2025

Hack Week 2025 at Dropbox centered on the theme “Keep It Simple,” offering opportunities for innovation, experimentation, and finding smart solutions to complex challenges. With in-person hubs in San Francisco, Seattle, and Warsaw—as well as the option to hack virtually—the July event brought together Dropbox developers to explore new ideas and build projects that could shape future products and workflows for tools like Dropbox Dash.

One standout effort, “Liquid Cooling CPU/GPU Hardware,” earned the Learn Fast award for accelerating learning and innovation. The team—Bobby Woolweaver, Daniel Coultas, Eddie del Rio, Eric Shobe, and Daniel Parker-Focht—designed a custom liquid cooling system for high-powered GPU servers to tackle the rising thermal demands of AI workloads. They built a lab setup, tested core components, and demonstrated significant benefits: 20–30°C lower operating temperatures under stress, quieter performance than air cooling, and the potential for power savings and environmental benefits.

Forward-looking in scope, the project explores next-generation GPU servers that may require liquid cooling due to increases in power consumption and heat generation. The team plans to expand testing with more liquid cooling labs in multiple data centers. We sat down with systems engineer Bobby Woolweaver and data center engineer Daniel Coultas to discuss their award-winning project and what it could mean for the future of infrastructure at Dropbox.


Your experimental lab took home our Learn Fast award this year. Walk us through what you built and how.
Daniel:
For the Hack Week project, we built our own liquid cooling system from scratch. Normally, these systems come pre-assembled with pumps, radiators, and fans, and you just plug them in. But since we had trouble sourcing a complete system in time, we decided to put one together ourselves. We used scaled-down versions of the same core components you’d see in a data center liquid cooling setup: radiators to exhaust heat, fans, a pump, a reservoir, tubing, manifolds, and some basic sensors. The sensors were key so we could monitor performance and make sure everything was pumping correctly before we connected any expensive GPUs. Once that was in place, we hooked it up to the server itself.
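The pre-flight check Daniel describes, verifying that coolant is flowing correctly before any expensive GPUs are connected, could be sketched as a simple sensor-polling script. This is a hypothetical illustration only: the sensor names, thresholds, and the `read_sensors` stub are assumptions, not the team's actual tooling.

```python
# Hypothetical pre-flight check for a DIY liquid cooling loop.
# Sensor names, thresholds, and read_sensors() are illustrative
# assumptions, not Dropbox's actual monitoring setup.

FLOW_MIN_LPM = 3.0        # minimum acceptable coolant flow, liters/min
COOLANT_MAX_C = 45.0      # coolant should stay well below component limits
RESERVOIR_MIN_PCT = 80.0  # a dropping reservoir level can indicate a leak

def read_sensors():
    """Stub: replace with real pump/flow/temperature sensor reads."""
    return {"flow_lpm": 4.2, "coolant_c": 28.5, "reservoir_pct": 97.0}

def preflight_ok(readings):
    """Return (ok, problems) so the loop can be run dry before hardware is attached."""
    problems = []
    if readings["flow_lpm"] < FLOW_MIN_LPM:
        problems.append(f"low flow: {readings['flow_lpm']} L/min")
    if readings["coolant_c"] > COOLANT_MAX_C:
        problems.append(f"coolant too warm: {readings['coolant_c']} C")
    if readings["reservoir_pct"] < RESERVOIR_MIN_PCT:
        problems.append(f"reservoir low: {readings['reservoir_pct']}% (possible leak)")
    return (len(problems) == 0, problems)

ok, problems = preflight_ok(read_sensors())
print("OK to connect hardware" if ok else f"Hold off: {problems}")
```

The point of a check like this is exactly what the team did in practice: let the loop run and prove itself out on cheap instrumentation before risking the server.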

What thermal performance observations did you make while working on this project?
Bobby:
In terms of immediate thermal benefits, we saw a big difference. When running workloads on the liquid-cooled setup compared to our current air-cooled production system, temperatures were around 20–30°C lower under heavy stress tests. And these were torture-style tests, even harsher than what we’d normally see in production.
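A comparison like the one Bobby describes boils down to simple arithmetic over paired temperature logs. The numbers below are made up for illustration and are not the team's measurements:

```python
# Hypothetical paired stress-test temperature samples (degrees C),
# taken at the same points in the run. Values are illustrative only.
air_cooled    = [88, 91, 93, 95, 94]
liquid_cooled = [63, 65, 66, 68, 67]

deltas = [a - l for a, l in zip(air_cooled, liquid_cooled)]
avg_delta = sum(deltas) / len(deltas)
print(f"per-sample deltas: {deltas}")        # each in the 20-30 C range
print(f"average reduction: {avg_delta:.1f} C")
```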

Another key part of Hack Week was having the dedicated time to experiment with fan configurations. Since liquid cooling handled the CPUs and GPUs, we were able to remove or run many fans at lower speeds. We still needed some airflow for other components like DIMMs and the network card at the back, but those draw much less power and run at lower thermals compared to the GPUs and CPUs. Daniel even suggested building a specific airflow baffle to direct cooling to exactly where it’s needed.

The liquid cooling team’s lab at Hack Week 2025

Liquid cooling has been around for years. What first sparked your interest in exploring it as a potential solution for Dropbox?
Daniel:
Liquid cooling has been around for a while, and the industry has been actively experimenting with it. We’ve followed the technology closely, including by attending conventions like the Open Compute Project summit where it’s a big topic. Bobby and I have seen these setups before and thought, this is really interesting—how could we apply it to Dropbox? We’ve had that question in the back of our minds for years, and now we’re finally turning it into something concrete.

Bobby: Right. But it’s not as simple as just plugging in a liquid-cooled server. We need the right infrastructure in place so that if future high-performance servers require it, we’ll be ready. This project was about building that foundation.

So the challenge you’re solving for is future-focused—preparing for next-gen hardware and higher power needs?
Daniel:
Exactly. It’s about handling both individual server power draw and the overall data center footprint. As new servers demand more power, sticking with only air cooling would force us to spread them out over more space. With liquid cooling, we can stay efficient—using less space, less energy, and potentially lowering costs.

How might this technology fit into our current and future infrastructure strategy, particularly with respect to our focus on supporting AI workloads?
Bobby:
We’re seeing a greater need today for new solutions, especially with GPUs and AI workloads. These systems draw a huge amount of power and generate significant heat. While vendors aren’t yet requiring liquid cooling for their top-tier GPUs, we know it’s on the horizon. Air cooling may soon only support mid-range options.

Daniel: And with Dropbox focusing more on AI initiatives, it gave us the push we needed. As we expand into GPU-heavy systems, it’s important to evaluate higher-powered setups. Hack Week was the perfect opportunity to explore that.

The prototype liquid-cooled GPU server

Hack Week is a self-driven initiative where engineers are encouraged to explore projects independently. What resourcing or support were you given to explore this project?
Daniel:
Dropbox has always made it possible for us to experiment, and I’ve felt supported to try new ideas. Our team was really interested in liquid cooling, and since Bobby’s team shared that interest, we were able to secure some funding to kick the project off. It gave us the chance to dive in ourselves and really have fun with it. Of course, we still have to balance it with our regular work, but we’ve been empowered to make the time and space to do that.

Bobby: I’d say the same. Both of our teams strongly believe this is an area we need to invest in—to research, lay the groundwork, and be ready for what’s coming. Once we showed that, we received support all the way up to move forward. So it’s been great to have that backing and be able to push ahead.

In-person events like Hack Week are an important part of the Virtual First experience at Dropbox. You both had the chance to attend in person for the first time this year. What did you enjoy about the experience? What were some of the benefits of working together in a physical space?
Bobby:
I enjoyed getting to hack with the team and connect with people across the company that I don’t usually see. Being in the same room made it easy to bounce ideas off each other and solve issues quickly. For us in physical infrastructure, we usually kick off projects or bring-ups on site at a data center so we can quickly work through challenges and issues that are a normal part of any new project.

Daniel: My experience is very similar to Bobby’s. Being in person, we have easy access to the minds of our peers. We can bounce ideas off them, pick up workflow improvements, and problem-solve very quickly.

Additional contributors to this project include Eric Shobe, Eddie del Rio, and Daniel Parker-Focht.

~ ~ ~ 

If building innovative products, experiences, and infrastructure excites you, come build the future with us! Visit jobs.dropbox.com to see our open roles, and follow @LifeInsideDropbox on Instagram and Facebook to see what it's like to create a more enlightened way of working. 

