OpenShift AI custom workbenches
This week I learned about custom workbench notebooks in OpenShift AI. The approach is quite similar to Dev Spaces: in the end, it’s just a container image you have to build and push to a registry.
Here’s a useful link on how to get started quickly.
OpenShift AI and its upstream project, OpenDataHub, maintain an image repo with many pre-built images you can use as a base for your own implementation; here’s the link. If you already know how to use custom Devfiles in Dev Spaces, this should sound familiar, because there you probably use one of the Universal Developer Images as a base.
The main use cases are if you would like to
- use RStudio instead of the out-of-the-box supported workbenches, or
- have additional dependencies pre-installed so you don’t have to install them manually after each start (see the sketch below).
For more info, please check the guide.
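To give a rough idea of what this looks like in practice, here’s a minimal Containerfile sketch for the second use case. The base image tag and the requirements.txt file are illustrative assumptions; pick a current tag from the OpenDataHub notebooks repo.

```dockerfile
# Minimal sketch: extend a pre-built OpenDataHub workbench image with extra
# Python dependencies so they are baked into the image instead of being
# reinstalled manually after every workbench restart.
# NOTE: the tag below is illustrative -- check the image repo for current tags.
FROM quay.io/opendatahub/workbench-images:jupyter-datascience-ubi9-python-3.9-2023b

# Hypothetical requirements.txt listing the additional packages you need
COPY requirements.txt ./
RUN pip install --no-cache-dir -r requirements.txt
```

Once built and pushed to a registry, the image can be added as a custom notebook image in the OpenShift AI dashboard; the guide linked above walks through that part.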
AI energy consumption
In a discussion with a colleague, we explored the question of whether it’s generally advisable to use LLMs for everyday questions or whether traditional Google searches are preferable. He pointed me to the following blog posts, and I don’t want to withhold them from you. They’re a good read, and I definitely recommend them.
- https://engineeringprompts.substack.com/p/ai-energy-use
- https://engineeringprompts.substack.com/p/does-chatgpt-use-10x-more-energy
The blog posts analyze how much energy an AI query consumes and compare it to a traditional Google search. TL;DR: an AI query probably costs about 10x more energy, although the boundaries are blurring now that Google also integrates AI functionality into its search.
OpenShift and AI
In a conversation with my manager, he asked me if I thought knowledge of OpenShift and knowledge of AI were separable.
He wanted to know whether the role of a Specialist Solution Architect for the OpenShift App Platform (which is my role) also requires AI knowledge (even if only partial knowledge) to perform the job satisfactorily, or whether one could fill the role entirely without it.
My answer was that you can’t get away from AI these days anyway.
Take Konveyor, for example, the project that supports migrations to Quarkus: it will release Konveyor AI at the end of the year, which will enable this migration work to be carried out with AI support. To be able to explain Konveyor AI confidently, you need at least rudimentary knowledge of the GenAI space; otherwise, you’re already in trouble at the first question.
I haven’t even mentioned OpenShift Lightspeed, InstructLab, vLLM, and many other super exciting projects that have recently landed at Red Hat. But you simply can’t get away from them these days.
And it’s better to embrace the new than to ignore it.