Self Evolving AI Agents

AI agents need tools to be useful. If you want to build a massive and complex AI agent, you've therefore got two choices.

  1. Create "a bajillion" tools for every imaginable problem your AI agent might possibly face
  2. Create an AI agent that generates tools on demand

We chose the latter, and we can now create new tools on demand in about 1 to 3 seconds. More importantly, we can in theory generate tools for most problems an AI agent can be confronted with, and we can generate such tools using natural language.

Use Cases

Really, what use cases exist for computers? The same use cases apply to our self evolving AI agent. Below I've got some examples to illustrate the point. All of these are 100% working prompts that you could execute today if you wish.

Website scraping

Imagine you need to crawl and scrape some website, transform the HTML into Markdown, and store the result in some database. Easy peasy ...

Crawl ainiro.io/sitemap.xml and convert the first 15 URLs to Markdown, and insert into database 'cms', and its table 'pages', storing the generated markdown into the 'content' column.

The above would result in the following 100% correct Hyperlambda code.

http.get:"https://ainiro.io/sitemap.xml"
xml2lambda:x:-/*/content
.count:int:0
for-each:x:@xml2lambda/*/urlset/*/url/*/loc/**/#text
   if
      lt:x:@.count
         .:int:15
      .lambda
         math.increment:x:@.count
         http.get:x:@.dp/#
         html2markdown:x:-/*/content
            url:x:@.dp/#
         data.connect:cms
            data.create
               table:pages
               values
                  content:x:@html2markdown

I know software developers who have spent months trying to solve the above problem manually, without succeeding. With Magic Cloud and Hyperlambda it's a simple English sentence, resulting in code that executes 20 times faster than its Python equivalent, requires 10% of the lines of code, and simply solves the problem automatically.

If I wanted to, I could instruct it to use a token from my configuration file, at which point it could crawl and scrape password-protected websites.
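
For comparison, below is a rough sketch of what the same task might look like in plain Python. This is an illustration and not the generator's output; it assumes the 'cms' database is reachable as a local SQLite file with a 'pages' table already created, and that the third-party 'requests' and 'markdownify' packages are installed.

# Illustrative Python equivalent of the sitemap-to-Markdown task above.
# Assumptions (not from the article): 'cms' is a local SQLite file whose
# 'pages' table already exists, and 'requests' and 'markdownify' are installed.
import sqlite3
import xml.etree.ElementTree as ET

import requests
from markdownify import markdownify as md

SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def crawl_sitemap_to_db(sitemap_url: str, db_path: str, limit: int = 15) -> None:
    # Fetch and parse the sitemap XML, keeping only the first 'limit' URLs.
    root = ET.fromstring(requests.get(sitemap_url, timeout=30).text)
    urls = [loc.text for loc in root.findall(".//sm:loc", SITEMAP_NS)][:limit]

    con = sqlite3.connect(db_path)
    try:
        for url in urls:
            # Download each page and convert its HTML to Markdown.
            html = requests.get(url, timeout=30).text
            content = md(html)
            # Store the generated Markdown in the 'content' column.
            con.execute("INSERT INTO pages (content) VALUES (?)", (content,))
        con.commit()
    finally:
        con.close()

crawl_sitemap_to_db("https://ainiro.io/sitemap.xml", "cms.db")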

Email automation

Select all records from 'crm' database and its 'contacts' table. For each, send an email using the [email] and [name] column. Load the file "/etc/email.html" and apply template arguments for the '{{name}}' field on the file before using it as your body. Use "Welcome to Magic" as your subject.

The result is shown below.

data.connect:crm
   data.read
      table:contacts
      limit:-1
      columns
         email
         name
   for-each:x:@data.read/*
      .email
      set-value:x:-
         get-value:x:@.dp/#/*/email
      .name
      set-value:x:-
         get-value:x:@.dp/#/*/name
      io.file.load:/etc/email.html
      strings.replace:x:-
         .:{{name}}
         get-value:x:@.dp/#/*/name
      mail.smtp.send
         message
            to
               .
                  email:x:@.email
                  name:x:@.name
            subject:Welcome to Magic
            entity:text/html
               content:x:@strings.replace

Kaboom! 10,000 emails sent!
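
For comparison again, a minimal Python sketch of the same mail merge might look something like the below. It is not the generator's output; it assumes the 'crm' database is a local SQLite file and that the SMTP host and credentials in the placeholders are filled in for your environment.

# Illustrative Python sketch of the mail merge above.
# Assumptions (not from the article): 'crm' is a local SQLite file, and the
# SMTP placeholders below are replaced with real values.
import smtplib
import sqlite3
from email.message import EmailMessage
from pathlib import Path

SMTP_HOST = "smtp.example.com"   # placeholder
SMTP_USER = "user"               # placeholder
SMTP_PASS = "password"           # placeholder

def send_welcome_emails(db_path: str, template_path: str) -> None:
    # Load the HTML template once.
    template = Path(template_path).read_text(encoding="utf-8")

    # Select all contacts, keeping only the email and name columns.
    con = sqlite3.connect(db_path)
    rows = con.execute("SELECT email, name FROM contacts").fetchall()
    con.close()

    with smtplib.SMTP(SMTP_HOST, 587) as smtp:
        smtp.starttls()
        smtp.login(SMTP_USER, SMTP_PASS)
        for email, name in rows:
            # Apply the '{{name}}' template argument before sending.
            body = template.replace("{{name}}", name)
            msg = EmailMessage()
            msg["To"] = f"{name} <{email}>"
            msg["From"] = SMTP_USER
            msg["Subject"] = "Welcome to Magic"
            msg.set_content(body, subtype="html")
            smtp.send_message(msg)

send_welcome_emails("crm.db", "/etc/email.html")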

SEO

Imagine how much information you could retrieve from your website if you could semantically scrape it, returning only H1 headers, URLs, etc. How valuable would this be for your SEO initiatives? Well, all of the below prompts actually work with the Hyperlambda generator (a rough Python sketch of one of them follows the list).

  • "Return the first 50 URLs from ainiro.io's sitemap, in addition to count of how many URLs are there in total"
  • "Crawl ainiro.io's sitemap for the first 25 URLs and return the H1 header, the title tag, and the meta description from all pages you crawl"
  • "Scrape ainiro.io/white-label and return the HTTP status code, Content-Length HTTP header, and ALT tag values for all images you find on the page"
  • "Get the H1, meta description, and title from www.hubspot.com"
  • "Fetch all hyperlinks with their trimmed anchor text values from xyz.com/articles/foo, and return these"
  • "Scrape xyz.com/data/reports and return the Organization JSON-LD schema"
  • "Crawl all hyperlinks from ainiro.io and return their HTTP status codes and trimmed H1 value, in addition to how many milliseconds was needed to load the links"
  • "Return all 404 URLs from ainiro.io's sitemap"
  • "Return all dead links from ainiro.io/ai-chatbots"
  • "Crawl the first 5 URLs from ainiro.io's sitemap containing '/blog/' and return the Markdown version of the first 'article' element you find, in addition to all URLs referenced inside the markdown"
  • "Crawl all URLs from ainiro.io/sitemap.xml and return all H1 values, title values, and meta descriptin values"
  • "Fetch all external hyperlink URLs from 'https://ainiro.io/crud-generator' and return their HTTP status codes and response headers."
  • "Crawl all images on ainiro.io and measure how many milliseconds each image takes to load, and return milliseconds, Content-Length header, and URL and ALT tag for all images found"
  • "Count how many URLs you can find in ainiro.io's sitemap"
  • "Scrape ainiro.io/blog/whatever-article and return its JSON-LD schema"

Wrapping up

It's not perfect. It messes up every now and then, and I am constantly eliminating "issues" with it. But if you run it through the dashboard, the AI agent will for the most part help you create prompts that are known to work for whatever thing you're trying to do. Notice, though, that Hyperlambda is a DSL and not a general-purpose programming language, so there are a lot of things a normal programming language can do that Hyperlambda cannot. Still, if you know its capabilities and quirks, I'd argue it's a highly useful thing, allowing you to generate AI agent tools on the fly, as needed.

Psst, try it out at our natural language API page, or watch me demonstrate the thing in the following video.

Thomas Hansen

I am the CEO and Founder of AINIRO.IO, Ltd. I am a software developer with more than 25 years of experience. I write about Machine Learning, AI, and how to help organizations adopt said technologies. You can follow me on LinkedIn if you want to read more of what I write.

This article was published January 5, 2026

