Non-JS Browsers Don’t Count in GA
While the volume of real users browsing your site with JavaScript turned off may be as low as 1%, the volume of bots visiting your site may be large indeed. Bots such as Googlebot and Bingbot do not run JavaScript, but they do download your website’s pages.
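Because Google Analytics only counts clients that execute its JavaScript tag, the gap between your webserver’s logged pageviews and GA’s reported pageviews gives a rough upper bound on bot and non-JS traffic. A minimal sketch, using hypothetical figures (not real data):

```python
# Hypothetical monthly figures — substitute your own numbers.
server_log_pageviews = 1_800_000   # every HTML response the webserver sent
ga_pageviews = 900_000             # JS-measured pageviews reported by GA

# GA only sees JS-executing clients, so the difference is an upper
# bound on bot + non-JS traffic (it also absorbs ad blockers etc.).
non_js_share = (server_log_pageviews - ga_pageviews) / server_log_pageviews
print(f"Pageviews invisible to GA: {non_js_share:.0%}")  # → 50%
```

The result is an upper bound rather than an exact bot count, since ad blockers and failed tag loads also hide real users from GA.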
It’s All About the Bots
The proportion of bot traffic on your website is a function of the number of pages your site presents to bots for crawling versus the volume of normal traffic. For sites with hundreds of thousands of real users per month and many pages for a bot to crawl (such as ecommerce product pages), bot traffic may exceed 50-75% of all traffic on your website. This can be a shocking statistic to see for the first time.
For example, here are the top 20 user agents browsing a particular website which serves circa 60,000 pages a day (including bot traffic). The obvious bot browsers have been noted: the top 2 browsers, and 6 of the top 10, are bots.
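You can produce a user-agent ranking like this from your own access logs. A minimal sketch, assuming combined log format and using a hypothetical substring list to flag obvious crawlers (real analysis would read your webserver’s log file instead of the embedded sample):

```python
import re
from collections import Counter

# Hypothetical sample lines in combined log format; in practice,
# read these from your webserver's access log.
LOG_LINES = [
    '1.2.3.4 - - [10/Oct/2023:13:55:36 +0000] "GET /p/1 HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '1.2.3.5 - - [10/Oct/2023:13:55:37 +0000] "GET /p/2 HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"',
    '1.2.3.6 - - [10/Oct/2023:13:55:38 +0000] "GET /p/3 HTTP/1.1" 200 512 "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/118.0 Safari/537.36"',
]

# Substrings that commonly identify crawlers — an assumption; extend as needed.
BOT_MARKERS = ("googlebot", "bingbot", "slurp", "duckduckbot",
               "baiduspider", "yandex", "bot", "crawler", "spider")

def user_agent(line: str) -> str:
    """Extract the user-agent field (the last quoted string in the line)."""
    return re.findall(r'"([^"]*)"', line)[-1]

def is_bot(ua: str) -> bool:
    """Crude classification: flag any UA containing a known crawler marker."""
    return any(marker in ua.lower() for marker in BOT_MARKERS)

counts = Counter(user_agent(line) for line in LOG_LINES)
for ua, n in counts.most_common(20):
    label = "BOT" if is_bot(ua) else "   "
    print(f"{n:6d}  {label}  {ua}")
```

Substring matching is deliberately crude: some bots spoof real browser user agents, so treat the flagged share as a lower bound.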
Sites with larger volumes of normal traffic (over 2 million pages per month) do not usually see bots as a significant proportion of traffic, except where the catalog of pages presented to the bots is unusually large.
Serve Bots Faster
Serving bots faster, more reliable pages is a good thing. While you may choose to reduce the rate at which bots crawl your website, you cannot increase it (see for example Google Support). You can also mark pages on your site for the bots not to crawl. However, serving faster responses to search bots (for all content types) is advantageous for your site’s SEO ranking. See Google’s reference to this.
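Both controls live in your site’s robots.txt. A minimal sketch with hypothetical paths (note that `Crawl-delay` is honored by Bing but ignored by Google, whose crawl rate is managed separately, as per the Google Support reference above):

```
# robots.txt — hypothetical example paths
User-agent: *
Crawl-delay: 10          # seconds between requests; not honored by Googlebot
Disallow: /internal-search/
Disallow: /checkout/
```

Excluding low-value pages such as internal search results shrinks the catalog presented to crawlers, which directly reduces the bot share of your traffic.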
Why Is a Bot Page View the Same as a Normal User’s?
Bots make your site serve pages in the same way as a real browser does. They request the HTML and its associated assets just as a real user would, so from the perspective of your webserver and section.io, a bot requires the same level of service as a real user.