From the course: The AI-Driven Cybersecurity Analyst


Running AI offline and locally

- [Instructor] By now, I hope I've convinced you that AI can be a powerful copilot for cybersecurity analysts. But maybe you're concerned about data privacy, or perhaps your company's generative AI policy restricts tools like ChatGPT. If that's the case, running AI locally might be the answer. In this video, I'll walk you through what local AI tools are and why they matter, open-source LLMs worth exploring, and how to optimize an offline AI model for your needs. Let's dive in. So, what's a local AI tool? A local AI tool lets you harness the power of AI without sending your data to a third party. When you use tools like ChatGPT, you're interacting with cloud-based AI models over the internet. While convenient, this raises concerns about privacy and security. But there's good news: as a cybersecurity analyst, you don't have to rely on online or paid AI tools. It's surprisingly easy to download a pre-trained LLM onto your laptop, a private server, or even a secure cloud instance. Platforms…
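To make that concrete, here is a minimal sketch of what running an open-source LLM locally in Python can look like, using the Hugging Face transformers library. The model name, prompt, and generation settings are illustrative assumptions, not something named in the course; the weights only need to be downloaded once, after which the model can run fully offline.

    # Minimal local-inference sketch.
    # Assumptions: the transformers library and a PyTorch backend are installed,
    # and the model weights are already downloaded and cached locally,
    # so no prompt data leaves the machine at inference time.
    from transformers import pipeline

    # Illustrative small open-source chat model; swap in any local model you trust.
    generator = pipeline("text-generation", model="TinyLlama/TinyLlama-1.1B-Chat-v1.0")

    # Hypothetical analyst prompt; the text is processed entirely on your own hardware.
    prompt = "Explain why repeated failed SSH logins from one IP might be suspicious."
    result = generator(prompt, max_new_tokens=128)

    print(result[0]["generated_text"])

Because the model runs on your own hardware, prompts and any logs you paste into them stay inside your environment, which is exactly the privacy benefit described above.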