How we built a 1% website in 3 days for €7

Due to events outside of our control, Aria, Tage, and I were forced into a situation where we had to launch a successful and profitable ChatGPT-based website chatbot company in 2 weeks. The stakes are super high, but we'll get there with some help from our friends. Read more about our ChatGPT-based website chatbot solution here.

Failure is not an option though, so we're confident that we'll make it - especially considering all of our amazing friends who have been there for us these last few days and supported us. Thank you, it means the world to us. The words "teamwork and friendship" have taken on meaning for us these last few days in ways most people unfortunately will never experience in their lifetimes.

To Infinity and Beyond!

What is a 1% website?

Anyways, so what is a 1% website? It's a website that performs in the top 1%. This implies that if you pick 100 random websites in the world and include ours, ours will be the best-performing website on neutral metrics. It's probably not a scientific definition, and I cannot prove that we're in the top 1% - but I'm confident enough to state it publicly anyways. So, let me explain how we did it, for those interested.


I have a love/hate relationship with WordPress. For non-technical people, it's an amazing tool. However, for a software developer who has spent 25 years working with web-related technologies, it's like attaching an open parachute to a racing car. The markup it generates is far from optimal, performs like $%^&, and adding Elementor only increases the damage. WordPress, especially in combination with Elementor, becomes a "tag soup from hell". The same is true, to some extent, of more or less all CMS systems out there. Then again, if you're not a software developer, use WordPress - it's an amazing no-code and low-code tool if you need a web presence. For us though, it's simply not "relevant", so we chose to build our website entirely on top of Magic Cloud and Hyperlambda.

This means that every single HTML tag was manually added using Visual Studio Code, something you can appreciate if you click "View Page Source" on this page. This gives us 100% control over what's rendered to the browser, resulting in dramatically higher-quality HTML. And since Hyperlambda websites allow us to create "reusable components", the end code is still 100% "DRY" (Don't Repeat Yourself).

Basics of a Hyperlambda website

A Hyperlambda website allows you to declare codebehind files in Hyperlambda. If you've got an HTML file called "foo.html", for instance, you can create a Hyperlambda file in the same folder called "foo.hl", and it will serve as a codebehind file, allowing you to dynamically inject lambda expressions into your HTML, which are substituted with their return values. You can read more about Hyperlambda websites here. Scroll to the bottom to find the relevant parts.
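Hyperlambda's syntax is out of scope for this article, but the codebehind idea itself is simple enough to sketch. The following is purely an illustration in JavaScript - not how Magic Cloud actually implements it - where placeholders in the HTML are substituted with the return values of matching "codebehind" functions:

```javascript
// Illustrative sketch only -- NOT Magic Cloud's actual resolver.
// The idea: an HTML file contains placeholders such as {{title}},
// and a sibling "codebehind" object supplies one function per
// placeholder, whose return value is substituted into the markup.

function renderPage(html, codebehind) {
  // Replace every {{name}} placeholder with the result of invoking
  // the matching codebehind function, if one exists.
  return html.replace(/\{\{(\w+)\}\}/g, (match, name) => {
    const fn = codebehind[name];
    return typeof fn === 'function' ? fn() : match;
  });
}

// Hypothetical "foo.html" content and its "foo.hl" counterpart.
const html = '<h1>{{title}}</h1><p>Generated {{today}}</p>';
const codebehind = {
  title: () => 'Hello from the codebehind',
  today: () => new Date().toISOString().slice(0, 10)
};

console.log(renderPage(html, codebehind));
```

The real thing evaluates actual Hyperlambda in the codebehind file, but the substitution principle is the same.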

Hyperlambda websites give us 100% control over the HTML, in addition to providing all the sane defaults out of the box, while allowing us to override things where needed. By "sane defaults", I mean things such as HTTP caching of static resources, etc.

The blog parts of our site are similar to Jekyll, implying blog posts are simply Markdown files, except the resolver transforming the Markdown into HTML is (pun!) Hyperlambda and not Jekyll. If you don't like Hyperlambda, Jekyll is an amazing alternative.
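To make the Jekyll comparison concrete, here's a toy resolver in JavaScript that turns a tiny subset of Markdown (headings and paragraphs) into HTML. The actual transformer handles full Markdown; this sketch only illustrates the pipeline of "Markdown file in, HTML out":

```javascript
// Toy Markdown-to-HTML resolver -- illustrates the "blog post is just
// a Markdown file" idea. Real resolvers (Hyperlambda, Jekyll) support
// the full Markdown spec; this one handles only # headings and paragraphs.

function markdownToHtml(markdown) {
  return markdown
    .split(/\n{2,}/)                      // blocks separated by blank lines
    .map(block => block.trim())
    .filter(block => block.length > 0)
    .map(block => {
      const heading = block.match(/^(#{1,6})\s+(.*)$/);
      if (heading) {
        const level = heading[1].length;  // number of # characters
        return `<h${level}>${heading[2]}</h${level}>`;
      }
      return `<p>${block}</p>`;
    })
    .join('\n');
}

const post = '# A 1% website\n\nBuilt in 3 days for 7 euros.';
console.log(markdownToHtml(post));
// <h1>A 1% website</h1>
// <p>Built in 3 days for 7 euros.</p>
```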

In addition to the above, we've got a private GitHub repository, with a CI/CD pipeline implemented as a GitHub Action. This means that each time I push to the master branch of our website's repository on GitHub, a new version is automatically deployed. To illustrate that fact, realise I'm actually using VS Code to write this blog post to you ... ;)
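A minimal workflow for that kind of pipeline might look like the following. This is a generic sketch, not our actual workflow file; the deployment step in particular (the script name and the secret) is hypothetical and depends entirely on your hosting setup.

```yaml
# .github/workflows/deploy.yml -- generic sketch, not our actual pipeline.
name: Deploy website

on:
  push:
    branches: [master]   # every push to master triggers a deployment

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3

      # Hypothetical deploy step: hand the checked-out site to whatever
      # your hosting provider expects (rsync, SSH, an API call, etc.).
      - name: Deploy to server
        run: ./scripts/deploy.sh
        env:
          DEPLOY_KEY: ${{ secrets.DEPLOY_KEY }}
```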

Web Frameworks

We did not start out with a template at all. In fact, the website is manually designed from the bottom up, and the only "framework" we're using is Bootstrap. Apart from Bootstrap, every single line of code, CSS, and HTML is manually written, for the most part in vanilla JavaScript, may I add. All animations are pure CSS animations, again manually written, and even the contact form's HTTP POST request was manually created using JavaScript's fetch function. I've included it for reference below if you're interested in the details.

  function sendEmail(email, name, content, recaptcha_response) {
    fetch('', {
      method: 'POST',
      headers: {
        'Accept': 'application/json',
        'Content-Type': 'application/json'
      },
      body: JSON.stringify({
        email,
        name,
        content,
        recaptcha_response
      })
    })
      .then(response => response.json())
      .then(response => {
        if (response?.result !== 'success') {
          throw new Error('Bad request');
        }
      })
      .then(() => {
        const contact_us = document.getElementById('contact_us');
        contact_us.className = 'row fade-out';
        const thank_you = document.getElementById('thank_you');
        thank_you.className = 'row fade-in thank_you';
      })
      .catch((err) => {
        alert('We could not send your email :/');
      });
  }

I admit it's a little bit simple, but it gets the job done. Submitting the form looks as follows, making sure we have reCAPTCHA support (version 3).

  function onSubmit(e) {
    grecaptcha.ready(function () {
      grecaptcha
        .execute('YOUR_RECAPTCHA_TOKEN_HERE', { action: 'submit' })
        .then(function (token) {
          const msg = 'Website; ' + document.getElementById('website').value + '\r\n\r\nMessage;\r\n' + document.getElementById('info').value;
          // The 'email' and 'name' element ids below are assumptions.
          sendEmail(document.getElementById('email').value, document.getElementById('name').value, msg, token);
        });
    });
    return false;
  }

And that's about all the JavaScript on our page. If you need a "contact us" form, you're of course free to use the above code.
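For completeness, here's a sketch of the markup the JavaScript above expects. The ids contact_us, thank_you, website, and info come straight from the code; the ids for the email and name inputs are assumptions on our part.

```html
<!-- Sketch only; email/name input ids are assumed, the rest match the JS. -->
<form id="contact_us" class="row" onsubmit="return onSubmit(event);">
  <input id="name" type="text" placeholder="Your name">
  <input id="email" type="email" placeholder="Your email">
  <input id="website" type="url" placeholder="Your website">
  <textarea id="info" placeholder="How can we help?"></textarea>
  <button type="submit">Send</button>
</form>
<div id="thank_you">Thank you, we'll be in touch!</div>
```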

ChatGPT chatbot

Of course, the seasoned software developer will now object and ask "how did you get the chatbot working?", at which point I can answer that we're obviously using our own AI chatbot. It's just a simple JavaScript inclusion tag, and an integrated part of Magic Cloud, so there was zero work from my end to get a functioning ChatGPT-based chatbot on the site. I suspect there's a "not invented here syndrome" lesson in there some place, but I'll leave it up to the reader to find it ...

SEO metrics

If you're not in Google's SERP, your website might as well not exist. The way we measure SEO is by using seobility. This is an amazing tool that allows you to measure how your site performs on metrics such as keywords, titles, descriptions, etc. Run your site through seobility, and 5 minutes later it'll tell you everything you're doing wrong and provide hints as to how to improve your site. Below is how our site is performing according to seobility.

How AINIRO is performing on SEO

I need to emphasize here that I have previously spent months optimizing WordPress websites without even being able to achieve the above score. The reason is all the "magic stuff" going on in WordPress, supposedly making "things simple", which results in less control, noisier HTML, and less responsive sites. I admit the site isn't perfect, but a score of 84% easily outperforms 99% of all websites out there. And importantly, it was built in 3 days, and we'll continue working on it in the future to optimize it.

Page speed

This is "complicated". Our site by itself performs amazingly. However, Google's reCAPTCHA JavaScript library is, to be polite, "sub-optimal". If anybody has advice on alternatives, I would love to hear about them in the comments. The irony is that Google uses page speed metrics to measure "quality of websites", improving your SERP positioning - and then ships a CAPTCHA library that blocks for 3 seconds and downloads itself multiple times. I am not the first to point out this simple fact. Unfortunately, we need a CAPTCHA to prevent bots from sending us emails, so it's a bit of a catch-22 for us.

I have even tried optimizing reCAPTCHA's JavaScript loading manually, something you can see if you inspect the JavaScript on our landing page - but completely getting rid of the problem is simply not possible, or at least I don't know how. 98% of the penalty we're getting from Page Speed Insights originates from reCAPTCHA and Google Tag Manager. The irony ... :/
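One common mitigation, which may or may not fit your setup, is to defer loading the reCAPTCHA script until the visitor actually interacts with the contact form, so the landing page itself pays no cost. A sketch of the idea follows; the element ids and the site key placeholder are assumptions, not our production code.

```javascript
// Lazy-load the reCAPTCHA script on first interaction with the contact
// form, instead of on page load. Sketch only -- element ids and the
// site key placeholder are assumptions.

let recaptchaLoaded = false;

function loadRecaptcha(doc) {
  // Inject the reCAPTCHA <script> tag only once, on demand.
  if (recaptchaLoaded) return false;
  recaptchaLoaded = true;
  const script = doc.createElement('script');
  script.src = 'https://www.google.com/recaptcha/api.js?render=YOUR_SITE_KEY';
  script.async = true;
  doc.head.appendChild(script);
  return true;
}

// In the browser: trigger the load on first focus of any form field.
if (typeof document !== 'undefined') {
  document.querySelectorAll('#contact_us input, #contact_us textarea')
    .forEach(el =>
      el.addEventListener('focus', () => loadRecaptcha(document), { once: true }));
}
```

The trade-off is a short delay between first focus and the token being available, so your submit handler has to tolerate grecaptcha not yet being defined.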

Still, the site loads on my iPhone over 5G, without WiFi, in less than 3 seconds. Let us know how it loads on your phone in the comments. Without reCAPTCHA and Google Tag Manager it would have loaded in 0.5 seconds. Finally, we registered a CloudFlare account, allowing us to significantly improve page loading through their amazing CDN offering. We're still on the free tier, so we don't have edge caching of documents - but in a week or two we'll probably have implemented this too.


Anyways, that's how we created a 1% website in 3 days, outperforming 99% of all sites out there on neutral metrics, with a budget of €7. Give us 3 months though, and it'll probably be in the 0.1% range.

As a final note, I have to give credit where credit is due: the graphics were created entirely by Tage, who I think did an amazing job, not only creating great graphics but also reducing image sizes significantly, which obviously matters for SEO and page speed metrics. Then of course, Aria helped out a lot with QA testing and wording, and as a final touch, DALL-E created the images that Tage didn't.

Using the following technologies, we were able to create a 1% website in 3 days, spending no more than €7 per month:

  • Hyperlambda and Magic Cloud
  • DALL-E
  • SEOBility
  • Page Speed Insights
  • CloudFlare
  • Visual Studio Code
  • Adobe Illustrator (Tage's tool)

And finally, €7 per month on servers from DigitalOcean.


€7 per month, and 3 days' worth of work by 3 people, and we've got a "1% website".

Thomas Hansen

I am the CEO and Founder of AINIRO.IO, Ltd. I am a software developer with more than 25 years of experience. I write about Machine Learning, AI, and how to help organizations adopt said technologies. You can follow me on LinkedIn if you want to read more of what I write.

Published 14. Apr 2023