AirOps Academy
Knowledge Bases

Working with Website Data

Lesson Overview

In this video, you'll learn how to easily bring context from your existing website into your AirOps workflows using the Web scrape feature. By importing pages or entire sitemaps, you can add valuable data to your knowledge bases and create differentiated outputs with rich context.

  • 0:00: Introduction to bringing website context into AirOps workflows
  • 0:17: Using the Web scrape feature to import single pages or entire sitemaps
  • 1:03: Demonstrating a simple workflow using scraped website data
  • 2:53: Potential use cases for working with data fetched from the web

Key Concepts

Web Scrape Feature

The Web scrape feature in AirOps allows you to import data from your existing website into your knowledge bases. You can choose to import a single page or an entire sitemap, typically found at your domain followed by "/sitemap.xml". This feature enables you to bring valuable context from your website into your AirOps workflows easily.
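To make the sitemap idea concrete, here is a minimal sketch of what "importing an entire sitemap" involves under the hood: parsing the `sitemap.xml` document and collecting every page URL it lists. This is purely illustrative, not the AirOps implementation; the sample sitemap string and function name are assumptions for the example.

```python
import xml.etree.ElementTree as ET

# A tiny sample sitemap, in the standard format typically served
# at <your-domain>/sitemap.xml (hypothetical example data).
SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/blog/post-1</loc></url>
</urlset>"""

def extract_urls(sitemap_xml: str) -> list[str]:
    """Return every <loc> page URL listed in a sitemap document."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(sitemap_xml)
    return [loc.text.strip() for loc in root.findall("sm:url/sm:loc", ns)]

print(extract_urls(SITEMAP_XML))
```

Each extracted URL would then be fetched and its content chunked into a knowledge base; AirOps handles all of this for you when you point the Web scrape feature at your sitemap.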

Knowledge Base Search Step

The Knowledge Base Search step is crucial for retrieving relevant information from your scraped website data. After you select the desired knowledge base, specify the number of results to return, and pass in a search phrase, AirOps returns the chunks of text from the knowledge base that are most semantically similar to your phrase. This step lets you fetch the relevant subsections of your website data and use them in your workflows.
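The idea of "most semantically similar chunks" can be sketched with a toy ranking function. AirOps uses real semantic embeddings; the bag-of-words cosine similarity below is only a stand-in to show the shape of the step (a knowledge base of chunks, a search phrase, and an `n_results` cap), and the chunk texts and function names are invented for illustration.

```python
import math
import re
from collections import Counter

def tokens(text: str) -> Counter:
    """Lowercase bag-of-words vector for a piece of text."""
    return Counter(re.findall(r"\w+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse token-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def kb_search(chunks: list[str], phrase: str, n_results: int = 2) -> list[str]:
    """Return the n_results chunks most similar to the search phrase."""
    q = tokens(phrase)
    return sorted(chunks, key=lambda c: cosine(tokens(c), q), reverse=True)[:n_results]

# Hypothetical scraped-website chunks:
chunks = [
    "Our pricing starts at $49 per month for the starter plan.",
    "Customer quote: 'AirOps cut our drafting time in half.'",
    "The API supports JSON and CSV exports.",
]
print(kb_search(chunks, "how much does the starter plan cost", n_results=1))
```

In a real workflow the search phrase is often itself a variable, for example the article topic or a user input, so each run retrieves the subsections of your site most relevant to that run.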

Integrating Scraped Data with LLMs

Once you have retrieved relevant data from your website using the Knowledge Base Search step, you can integrate it with Large Language Models (LLMs) to create differentiated outputs. By passing the scraped data into an LLM prompt, you can generate content that incorporates rich context from your website. This technique enables you to maintain consistency with previously published content, pull in customer quotes, reference accurate product information, and more.
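The hand-off from search results to an LLM prompt can be as simple as interpolating the retrieved chunks into a templated instruction. This is a minimal sketch of that pattern, assuming a hypothetical `build_prompt` helper and example chunk text; in AirOps you would do the equivalent by referencing the search step's output inside an LLM step's prompt.

```python
def build_prompt(topic: str, context_chunks: list[str]) -> str:
    """Assemble an LLM prompt grounded in retrieved website chunks."""
    context = "\n".join(f"- {chunk}" for chunk in context_chunks)
    return (
        f"Write a short paragraph about {topic}.\n"
        "Use only the facts in the context below and keep the brand "
        "voice consistent with previously published content.\n\n"
        f"Context:\n{context}"
    )

# Hypothetical retrieved chunks from the Knowledge Base Search step:
prompt = build_prompt(
    "our starter plan",
    ["Our pricing starts at $49 per month for the starter plan."],
)
print(prompt)
```

Because the context comes from your own site, the generated output can quote real customers, cite accurate pricing, and echo themes you have already published.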

Key Takeaways

  1. The Web scrape feature in AirOps simplifies the process of bringing context from your existing website into your workflows, allowing you to import single pages or entire sitemaps.
  2. The Knowledge Base Search step is essential for retrieving relevant subsections of your scraped website data, which can then be used in your workflows.
  3. By integrating scraped website data with LLMs, you can create differentiated outputs that incorporate rich context, maintain consistency with previous content, and reference accurate information.
  4. Potential use cases for working with data fetched from the web include internal linking, pulling social proof into articles, referencing product information, and ensuring consistency with previously published themes.
  5. Although the video demonstrates it in a simple form, this technique opens up a wide range of possibilities for enhancing your content and workflows with data from your existing website.
