Spotlight: Dave Lemaire of DYN
Introducing a new series on the MST blog: a quarterly spotlight on a community member. Tori, who handles Operations and Account Support for MST, was fortunate to interview Dave Lemaire, Director of Infrastructure Engineering at Oracle/DYN, for our inaugural post.
Hi Dave! Could you tell us a little about yourself and your background?
My current role is Director of Infrastructure Engineering at DYN/Oracle. I manage teams who are responsible for delivery of platforms – hardware, OS, and support services like VMs, key-value storage, monitoring, metrics, log collection, and secrets management. We maintain a continuous integration pipeline to deploy virtualized systems and container technologies. I have been in tech for over twenty years, beginning as a Systems Admin. In my current role, I get to use technology that is a little more advanced than your typical IT shop's.
How important is it for you to stay current with all of the new tech that hits the market?
New technology is generally created to solve a problem, often a developer’s challenge or something in the way of their effectiveness. Any developer’s goal is to deploy their code in the quickest and most effective way possible, because it helps their company deliver value to their customers faster. Another reason for me (and other leaders in the industry) to keep up with new technology is employee attraction and retention. Developers want to use the newest tools and be able to exercise their skills to the fullest extent. Keeping the tech that we use up to date is a great way to motivate and inspire employees, and to show them that you value their skills and capabilities.
What piece of emerging technology do you feel will be valuable for those in the industry to learn about now?
Containerization of platforms and services is very valuable right now. This tech helps isolate services, making operational aspects and development easier. Docker is the most popular containerization tool, although it was built on tech that has existed for about 30 years. The main difference is that it has been made more user-friendly and developer-friendly than the older tech.
The practical applications are great. Containerization tech facilitates a multi-cloud provider approach. If you can run a tool chain and deploy services across multiple clouds, there’s less vendor lock-in and services can be deployed to different providers (AWS, etc.). This makes the services more transparent to both the developer and the end user. There’s a lot of talk right now about “serverless tech,” and it’s become a catch-phrase in the same way “cloud” has. There’s still a server behind “serverless tech,” but it’s hosted by someone else. This allows for a more discrete breakdown – instead of deploying to a whole system, you’re writing a piece of code to deploy to a serverless application.
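To make the “write a piece of code, not a system” idea concrete, here is a minimal Python sketch of a serverless-style function. The `event`/`context` signature follows the AWS Lambda convention for Python handlers; the function name and payload fields are hypothetical, chosen just for illustration:

```python
import json


def handler(event, context):
    """A Lambda-style handler: the provider runs this function on demand,
    so the developer ships only this code, not an OS, VM, or container."""
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello, {name}"}),
    }


# Locally, the function can be exercised with no server at all:
result = handler({"name": "Dave"}, None)
print(result["statusCode"])  # → 200
```

The appeal Dave describes is visible in the signature itself: there is no socket, process manager, or deployment target in the code, only a function the platform invokes for you.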
What do you think is next?
Well, it’s hard to imagine what’s next, because code can only get so small. The progression has been, and continues to be, a whittling down of what you deploy, increasing the speed at which you can deploy it. We’re always looking to break code down into smaller units because it becomes easier to distribute. Think of it like personal computers – they keep getting smaller, faster, and more efficient. It’s the same with computing work. Serverless tech is still new, and I expect it to grow as it is more widely adopted. Anything that helps developers get their code out faster, and better, will be on the rise.