I came across a few articles on how to get a blog post onto the first page of Google and started to wonder how much effort and money it would take to build a website on a brand-new domain that ranks well on search engines, and how long it would take.
I chose to build a website with automated content, such as free data provided by an API, so I wouldn’t have to fuel the website by writing a new post every week or even every day. The goal was to make as little effort as possible, increase domain authority, and land on the front page.
I never thought of making money out of it. For that reason, after proving my point, I left it there, and I may delete it in the future.
Step 1: Get a domain name
The first thing I did, and this is a very important step, was to purchase a domain for my experiment. The domain you choose to buy should be as “optimized” as possible so it “looks good” to search engines. By “looking good” I mean you have to choose a domain that doesn’t look fake, spammy, etc.
Try to find a domain that doesn’t contain dashes or underscores, is easily readable, and, if possible, is a “dot com”. You are probably asking yourself, “such a domain must be very expensive”. If you dig enough, you will find a domain that suits your needs and is inexpensive.
I found a domain reseller offering “.com” and other TLDs for free for a limited time, so I got my domain for a total of $0.
Step 2: What kind of information will you provide on your website?
As I said above, the website’s data would come from free APIs, so I wouldn’t have to feed the website every day. I started looking for freely usable data that came with an API to access it.
I first looked at news APIs, and as I’m fascinated by space and technology, I started digging for something I could use. I settled on the Newsdata.io news API, both because I love the content and because it would give me a new post every day.
I requested an API key and, a minute later, had one that gave me access to the whole breaking-news database. Google Trends and other big news publishers offer RSS feeds with search trends for some countries, but I didn’t have to assemble those myself, since the news API already serves that purpose.
I created a Python script that retrieves news articles with the Newsdata.io news API, restricted to the science and technology categories. The API allows 200 requests per day for free, so again it cost me a total of $0 to gather all the information I needed.
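The original script wasn’t published, but a minimal sketch of the fetch step could look like this, assuming Newsdata.io’s latest-news endpoint and its `apikey` and `category` query parameters (check the official API docs before relying on them):

```python
# Hypothetical sketch of the daily fetch script. The free tier allows
# 200 requests per day, so one fetch per rebuild is well within budget.
import json
import urllib.parse
import urllib.request

ENDPOINT = "https://newsdata.io/api/1/news"

def build_url(api_key, categories=("science", "technology")):
    """Build the latest-news request URL for the given categories."""
    query = urllib.parse.urlencode(
        {"apikey": api_key, "category": ",".join(categories)}
    )
    return f"{ENDPOINT}?{query}"

def fetch_articles(api_key):
    """Fetch one page of articles and return the result list."""
    with urllib.request.urlopen(build_url(api_key), timeout=10) as resp:
        payload = json.load(resp)
    return payload.get("results", [])

if __name__ == "__main__":
    for article in fetch_articles("YOUR_API_KEY"):
        print(article.get("title"))
```

The fetched articles can then be dumped to disk as JSON, ready for the site generator described below.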
Step 3: Static “dynamic” website
Instead of creating a normal dynamic website, I created a “static-dynamic website” (what!).
But what is a “static-dynamic website”?
I don’t think the expression actually exists. What I wanted was to create a static website with a static site generator and rebuild it whenever the underlying data changes.
To collect the data I used my local NAS, which supports virtualization, but it could be any computer (Raspberry Pi, PC, etc.). This server was responsible for collecting the data, building the website, and publishing it to Netlify.
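The server loop above can be sketched as a small trigger script, run on a schedule (e.g. cron). All names and commands here are hypothetical placeholders for the actual stack, which part 2 will describe; the idea shown is just the “static-dynamic” trick: rebuild and redeploy only when the fetched data has changed.

```python
# Hypothetical rebuild trigger for the home server. Rebuilds the static
# site only when the article data differs from the last successful build.
import hashlib
import json
import pathlib
import subprocess

STATE_FILE = pathlib.Path("last_build.hash")

def data_fingerprint(articles):
    """Stable hash of the article list, used to detect changes."""
    blob = json.dumps(articles, sort_keys=True).encode()
    return hashlib.sha256(blob).hexdigest()

def rebuild_if_changed(articles):
    """Run the (placeholder) build and deploy commands on new data."""
    digest = data_fingerprint(articles)
    if STATE_FILE.exists() and STATE_FILE.read_text() == digest:
        return False  # nothing new: skip the build entirely
    # Placeholder commands: static build step, then deploy to the host.
    subprocess.run(["./build-site"], check=True)
    subprocess.run(["./deploy-site"], check=True)
    STATE_FILE.write_text(digest)
    return True
```

Because the output is plain static files, hosting stays free and the site still updates daily, which is the whole point of the “static-dynamic” setup.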
Step 4: Traffic
The best possible traffic for your website is organic traffic (the kind that comes from Google searches), but to get it, in addition to following good SEO practices, you must first increase your domain authority.
To do this you have to be patient, as it may take some time, but there are a few “tricks” you can use to speed up the process.
This is the part I’m least proud of. To increase your domain authority, you can use a traffic exchange (like Rankboostup) to boost your traffic. It provides quick results, but I don’t recommend overdoing it. It also helps to share links, articles, and posts in online communities to start building backlinks.
Domain — $0
Hosting — $0
Data — $0
Rankboostup — $0
As you can see, I spent almost $0 to get to the first page of Google. I say almost because I used my time and my personal server, which comes with a little electricity and other expenses, but nothing substantial.
This post is getting longer than I’d like, so I will write a part 2 containing the technical explanations and describing the stack (Lumen, SQLite, Jigsaw, Netlify, Debian) used in this experiment. It will also contain code snippets and examples of how I gathered the information and how I created the “static-dynamic” website.
Thanks for reading and stay tuned for more information.