The scores in the file scores.json are based on the criteria listed below. But before we start, I want you to notice the Python script sitescore.py, which is a very basic implementation of accessing this database.
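To give a taste of what that looks like, here is a minimal sketch of reading the database. It assumes scores.json maps each site to a dictionary of criterion names and boolean values; the actual format that sitescore.py expects may differ.

```python
# Minimal sketch of reading the scores database. This assumes scores.json
# maps each site to a dictionary of criterion names and boolean values;
# the real format that sitescore.py works with may differ.
import json

def load_scores(path="scores.json"):
    # Parse the whole database into a Python dictionary.
    with open(path, encoding="utf-8") as handle:
        return json.load(handle)

def total_score(criteria):
    # Every criterion the site satisfies is worth one point.
    return sum(1 for passed in criteria.values() if passed)

if __name__ == "__main__":
    for site, criteria in load_scores().items():
        print(site, total_score(criteria))
```

With that out of the way, the criteria are the following.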
The website should not restrict which IP address the user connects from. If a user wants to use a VPN or Tor to keep his identity private, he should be able to do so. Sites that break functionality, or refuse to load over Tor, will not get this point.
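One way this could be checked automatically is to fetch the site through a local Tor SOCKS proxy and see whether it answers. This is only a rough sketch: it assumes a Tor daemon is listening on its default port 9050 and that the requests library is installed with SOCKS support (requests[socks]); a real check would also compare the result with a direct fetch to catch pages that load but are broken.

```python
# Rough sketch of a Tor reachability check. Assumes a local Tor daemon
# exposes a SOCKS5 proxy on 127.0.0.1:9050 (the default) and that
# requests[socks] is installed.
import requests

TOR_PROXIES = {
    "http": "socks5h://127.0.0.1:9050",
    "https": "socks5h://127.0.0.1:9050",
}

def loads_over_tor(url):
    # True if the site returns a successful response through Tor.
    try:
        response = requests.get(url, proxies=TOR_PROXIES, timeout=30)
        return response.ok
    except requests.RequestException:
        return False

print(loads_over_tor("https://example.com"))
```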
This point goes to a site that has its sources readable. Meaning, even if they use some kind of web compiler to create the HTML and JavaScript code, it should still be easy enough to understand. Especially the JavaScript.
JavaScript is software. And this software should be Free, so a user could have the four essential freedoms with the JavaScript. There are programs like LibreJS that check each JavaScript file for a license. But obviously, if the file is compiled and minified, there is no license section in it. So at least the source from which the JavaScript is compiled should be Free Software, similar to Free Software compiled to a binary executable.
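To give an idea of the kind of check this involves, here is a simplified sketch that only looks for the @license and @license-end comment markers that LibreJS recognizes. It is a stand-in, not LibreJS itself, and "script.js" is just a placeholder file name for the example.

```python
# Simplified sketch of a license-tag check in the spirit of LibreJS: it only
# looks for the @license ... @license-end comment markers. The real
# extension performs much deeper analysis than this.
import re

LICENSE_OPEN = re.compile(r"@license(?!-end)\b")
LICENSE_CLOSE = re.compile(r"@license-end\b")

def has_license_tags(javascript_source):
    # A file counts as labeled only if both markers are present.
    return bool(LICENSE_OPEN.search(javascript_source)) and \
           bool(LICENSE_CLOSE.search(javascript_source))

# "script.js" is a placeholder file name for this example.
with open("script.js", encoding="utf-8") as handle:
    print(has_license_tags(handle.read()))
```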
Some browsers do not support JavaScript. Sometimes a user might disable all JavaScript from running. For example, the Tor Browser has a function to disable all JavaScript if you select the highest security setting. The website should not break its core functionality when JavaScript is off. Maybe the developers could make a separate HTML5 version of the site that perhaps looks worse, but still gives the users the core features this way.
Websites should not collect data. A website that sells physical things might know your credit card number and your address, so it can charge you and send you the things that you buy. This is okay. But a search engine doesn't need to know those things to operate, so it should not collect that data in the first place. Sometimes analytics about usage of the website are beneficial to its developers. In this case, they should ask for permission to collect them, giving the user an option to opt out of this data collection at any moment. The website should not break its core functionality if the user doesn't want data unrelated to this functionality to be collected.
If a website, like the one I mentioned, that sells you physical things, collects your data to send you the things that you buy, then as soon as the transaction is over this data should be removed from their servers. They should not keep it any longer. They could keep a record of what item was bought and in what amount, but not the credit card number and the address of the person that bought it. That has nothing to do with their bookkeeping, so they should erase it as soon as possible. And even with bookkeeping, when the season is over and a record is not needed anymore, the record should be erased.
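As a rough illustration of such a retention policy, here is a sketch using a hypothetical SQLite "orders" table; the table, its columns, and the file name are invented for the example.

```python
# Rough illustration of a retention policy: once an order is delivered,
# the buyer's address and card number are erased, while the item and the
# amount stay for bookkeeping. The "orders" table and its columns are
# hypothetical, invented only for this sketch.
import sqlite3

def purge_delivered_orders(db_path="shop.db"):
    connection = sqlite3.connect(db_path)
    with connection:
        # Blank out personal data, keep only what bookkeeping needs.
        connection.execute(
            "UPDATE orders "
            "SET address = NULL, card_number = NULL "
            "WHERE status = 'delivered'"
        )
    connection.close()

purge_delivered_orders()
```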
To make the web service one extra step more Free, the source code of the server should be available, and should be Free Software too. That way, if a person doesn't like how the site operates, but likes most of its functions, this person could make his own, similar site, editing out the nasty bits.
If a website is more complex than simply storing HTML documents, it must have an API to access its content, especially if the website is a web application, so people would be able to write their own clients for this website. For example, the LBRY protocol of Odysee could be considered a Free API. The API also should not require sacrifices, so everybody could build software using it; no accounts or paywalls should be there to restrict it.
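To show how low the barrier should be, here is a sketch of a third-party client talking to a hypothetical, unrestricted JSON API. The endpoint and field names are made up for the example; the point is that no account, API key, or paywall stands between the program and the data.

```python
# Sketch of a third-party client for a hypothetical, unrestricted JSON API.
# The endpoint and field names are made up; no account, API key, or paywall
# is needed to use it.
import json
import urllib.parse
import urllib.request

def search(query, base="https://example.org/api/search"):
    # Plain HTTP GET with the query as a URL parameter, JSON in response.
    url = base + "?q=" + urllib.parse.quote(query)
    with urllib.request.urlopen(url, timeout=30) as response:
        return json.load(response)

for result in search("free software").get("results", []):
    print(result.get("title"))
```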