A full web site starts with a server. This page provides an overview of the options.
- Terminology: What is a server?
- Bare Metal
- Sharing Pizza Boxes
- Costing & Pricing: Which Choice?
Terminology: What is a server?
The Wikipedia page on ‘server’ makes the ambiguity clear:
In computing, a server is a computer program or a device (computer) that provides functionality for other programs or devices, called “clients”
In fact, both the software and ‘the device’ (computer) are required to make a server. ‘The device’ can be any laptop or desktop computer with a network connection, or even a small computer such as a Raspberry Pi. However, because the word ‘server’ can mean ‘the device’ and can also mean ‘a computer program’, the word ‘server’ by itself is ambiguous.
The ‘server device’ (computer) must run ‘server software’ to work as a server, and ‘server software’ requires a suitable device, so both device and software are always present. The software and the device are partners, but confusingly ‘server’ can be used to mean either. So whenever ‘server’ is mentioned, context is needed to determine whether the term means the software or the device.
Further, while that definition from Wikipedia is quite accurate, ‘server’ is also quite often an abbreviation for ‘web server’, which is a server that provides one very specific
kind of functionality for other programs or devices. The goal is for context to make clear whether ‘server’ means ‘web server’, and whether it means the program or the device. On this page, ‘program’ or ‘software’ will be used for the software where there could otherwise be ambiguity, and ‘device’ or ‘computer’ for the device.
Bare Metal – DIY
Bare metal is simply an individual computer running the required server software. ‘Do it yourself’ means you put the software on a computer and look after both the computer and the software yourself. You can run a server on any desktop, or even laptop, computer that is connected to the internet. In fact, if you have a team of people on call 24 hours a day to keep the computer running, adequate knowledge of firewalls, multiple redundant connections to the internet, and an uninterruptible power supply with a backup generator, then this may be the option for your website. For the rest of us, this option is for the test version of the website, which the outside world does not visit. Since most of us will find hosting less expensive than providing an uninterruptible power supply and the other resources needed to keep a web server continually running, a server run on a normal desktop or laptop is usually best used only to run the tests that check new code is working before it is transferred to the public web server.
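As a minimal sketch of the DIY idea, assuming nothing more than a standard Python installation, the standard library’s http.server module can turn any computer into a (test-only) web server. The port number here is an arbitrary choice:

```python
# A minimal DIY test web server, using only the Python standard library.
# Suitable for local testing only; not for a public production site.
from http.server import HTTPServer, BaseHTTPRequestHandler

class HelloHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Answer every GET request with a short plain-text page.
        body = b"Hello from a DIY test server"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

def run(port=8000):
    # 8000 is an arbitrary test port: browse to http://localhost:8000/
    HTTPServer(("localhost", port), HelloHandler).serve_forever()
```

Calling `run()` and visiting http://localhost:8000/ in a browser shows the device and the software working together as ‘a server’.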
Bare Metal – Hosted: “Pizza Boxes”
‘Hosted’ means an organisation manages the computer you use as a server. It is just like DIY, except someone else looks after the server: an organisation that looks after servers for people as a business.
In the racks in the picture to the left, the flat ‘pizza box’ shaped units are individual server devices. These are devices specifically designed as web servers for use in a server room alongside many similar devices. Bare metal hosting is renting a computer much like one of the individual ‘pizza box’ computers in those racks. Instead of sitting at the computer, operation is done over the internet, using either a web page or a command line terminal. The hosting facility provides maintenance, an internet connection and power supply, and may also provide backups and other services, but with bare metal you have control of your own individual computer.
Pizza Box Sharing – ‘Slices’ or VPS (Virtual Private Servers)
An alternative to renting a complete ‘pizza box’ server device is to rent the equivalent of an individually wrapped slice of pizza: a Virtual Private Server. On any computer you can run a ‘virtual machine’, a program that provides an emulated computer running inside your computer. Now imagine running several of these virtual machines on the one computer.
Your site is hosted on a ‘virtual computer’ that is one of several virtual computers running on a single physical computer. In fact, there are probably a whole group of hosted physical computers, each of which is running several emulated computers. So, for example, with ten pizza box servers in a rack, there could be 100 virtual computers if each physical computer runs 10 virtual computers.
The result is several emulated virtual computers that appear as individual computers to the people using them, but all running on the one physical computer. For each person who gets a VPS, it seems just like getting a complete server computer, but in fact it is just a share of a computer: a computer that is emulating several computers. Each VPS runs its own operating system and its own filesystem, and has its own memory and storage. Just as when you cut a pizza into slices, each slice is complete in itself: all the taste of your own pizza, but in a smaller quantity at a lower price. So each slice appears as a complete computer that can run any programs you like on any operating system you choose, though most often the goal is to run web applications, which usually includes running a web server.
There are some restrictions on what is available in each ‘slice’ or VPS, but these are rarely relevant for web applications. No application on the VPS can take over and fully control any of the connections to the computer, as these must be shared between all the VPS ‘slices’. Access to the low level of the computer that manages the slices is also blocked, so no VPS can divide its computing power further to create a VPS within a VPS. Normally each VPS is allocated a fixed percentage of the computing power of the complete ‘pizza box’ server, so even if the other VPS servers are not busy, only the computing resources allocated to that VPS are available. Further, there is also some overhead in switching between VPS servers, so with four VPS slices or ‘virtual computers’ running on the same bare metal server, overall performance is somewhat reduced.
Pizza Box Sharing – Shared Hosting
Another approach to sharing is called shared hosting. This approach takes a computer, which could be either a VPS virtual computer or a complete ‘bare metal’ computer, and shares the web server software running on that computer. The one program is in fact serving several web sites. These web sites all use the same program and so follow the same overall behaviour, but with different data for each web site.
Imagine the management of a shopping mall offering to run a web site for each store in the mall. Each store is able to have its own look and choose its own web pages for that store’s section of the mall. The web server program could even be configured so each store could have its own news pages, even its own product catalog, make sales from the web site, and keep customer accounts for the store’s own customers. All as separate sections of the one overall web site, run on a single software web server for the mall. All that is required is to add a domain name pointing to the relevant section of the mall web site, and the store effectively has its own virtual website. This spreads the cost across the stores, but because the entire mall is hosted on the one web server, if any one shop has an overload of web traffic, all stores will be affected.
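As a sketch of how one web server program can serve the whole ‘mall’, here is what the idea looks like as nginx configuration (the domain names and paths are hypothetical examples, and nginx is just one of several web server programs that can do this):

```nginx
# One nginx instance serving two 'stores': same program, different data.
server {
    listen 80;
    server_name store-one.example.com;   # this store's domain name
    root /var/www/store-one;             # this store's own pages
}

server {
    listen 80;
    server_name store-two.example.com;
    root /var/www/store-two;
}
```

nginx picks the server block by the domain name in each request, so adding a store to the mall is just adding another block and pointing that store’s domain name at the mall’s server.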
To continue with the pizza box metaphor, this is the equivalent of a solution for people who don’t want the pizza, they just want toppings. Individual stores just want a web site, not a server device or even a VPS virtual server device, either of which requires the ability, and brings the responsibility, of installing, running and maintaining the programs on the server and keeping a computer system up to date and secure.
What is provided by just having a share is a server computer already set up and running, with web server software that can run multiple web sites, all with the one instance of the server software. Like everyone having access to the toppings of a pizza, the sharing only works if there is enough to go around, but for many applications a complete device is overkill. Sharing in this way does not control ‘who has what’ between those sharing as tightly as a VPS does. More flexibility, but less control for those sharing.
Web server programs are not the only software that can allow sharing. Email servers can also allow sharing, so every store in our ‘virtual mall’ can have not just its own web site with a custom look and feel and pages, its own products and customer database, but also its own email addresses. The limitation is that the programs to run these customer databases, products and so on are usually restricted to ‘off the shelf’ programs. Customising beyond ‘off the shelf’ is in the control of the manager of the virtual mall. The owner of the virtual mall has to set up and run the server ready for sharing, and generally will have all the websites on the resulting server as customers. The businesses or individuals with shared websites will normally require some support from the ‘virtual mall’ owner.
Pizza Slices – Sharing Slices!
It is possible to run a large number of sites by having one or more VPS servers, and then running sharing web server programs to allow multiple sites on each of those VPS servers. This can make the original ‘pizza box’ serve many web sites! Given that a huge percentage of websites see almost zero traffic, and sometimes just having a website is the goal, many sites need very little computing power. Will the site be visited by multiple people every second, or is a few people a day more realistic? Also note that the fear of a single web site on shared hosting consuming all the resources is overstated, because the site with all the traffic is both the cause and the main victim. Consider a group of low-usage sites all hosted together with one site that is really busy and making every site slow. The really busy site will be the one most affected once things get slow, so it will have the most complaints, and as it has the biggest problem, it is the busiest site that first needs to find more resources.
Pizza Slices & Sharing: WHM/CPanel
There are different perspectives on sharing multiple sites on a single web server program:
- managing several shared sites for use by one person or business
- managing several shared sites for clients of a web site building business
- having a shared site as a client of a web site building business
The most common tool for managing shared sites in all these cases is WHM (Web Host Manager), and the most common tool for configuring and controlling an individual share is the companion program cPanel. WHM is a program for those who have either a bare metal server or a Virtual Private Server to manage, and who wish to allow multiple web sites to share that server device. WHM provides management of the shared web sites on the server, and cPanel allows control of the individual shares.
So WHM is accessible only by the managers of the entire set of shares, while the companion program cPanel, which manages an individual shared website within a WHM-managed server system, is accessible both by the manager of the overall server device and by those using the individual shares. The combination of these two programs is really only needed for running multiple shared sites on a server, but it does automate configuration, and even those with a single public web site might actually want more than one web site, so they can have one or more private practice web sites in addition to their public ‘production’ site. WHM/cPanel can find many uses. Note that if you are using a shared site, as opposed to managing the shares, then cPanel is the only administration tool.
Misleading use of the term ‘cloud’
Sometimes ‘cloud’ can seem like simply another name for the internet, but don’t be misled: there is a genuine revolution called cloud. There is no official authority to enforce how terminology such as ‘cloud’ is used, so sometimes ‘in the cloud’ is stretched to mean anything ‘on the internet’; however, there is a far more specific meaning of ‘cloud’. Traditional server providers are likely to label services as cloud when they simply mean traditional web hosting, but there is a significant difference, and the hosting described above does not really constitute ‘cloud’. The specific idea of cloud is far more profound than just another name for the internet.
Superseding the Server: What is Cloud?
Think of each server as a water droplet: if you have sufficient servers, then you have a ‘cloud’. Something running on a single server, as with all the cases discussed previously, can be thought of as running on a ‘droplet’, not actually on the overall cloud.
A traditional server farm has a large number of servers, with ‘clients’ allocated to specific servers. These can be virtual servers, which are slices of a physical server, or an application can be allocated to an entire ‘pizza box’ server. For an even larger application, perhaps two or three servers may be allocated to run parts of the application. In that traditional server facility, each ‘client’ has a specific computer (or computers), or a share of a specific computer. The agreement is for use of a specific ‘slice of a computer’, or for one or more specific computers. The computer (or slice) is ‘yours’, for your use.
But with cloud, the servers are not allocated in this way. All the servers work together as one team of pooled resources and together form ‘the cloud’.
A computing task (such as serving a request for your web site) is allocated to the entire cloud, and then a ‘cloud managing program’ dynamically allocates that task, or pieces of that task, to computers within the cloud. So parts of the task could be handled by different actual servers within the team of servers, or cloud, every time that task needs to run. Answering an HTTP request is a task, and when there are no requests the cloud will allocate no resources, but when an HTTP request arrives the cloud can effectively ask for volunteers (servers) to perform parts of the workload until the task is done. The more requests for any HTTP server, the more of the cloud resources that will be made available to answer the requests, while spare time when there are no requests is simply available to the cloud to focus on other work.
The cloud analogy and name is perhaps not the best analogy for how the computers all working together create a combined intelligence more powerful than any one individual server. Another analogy is to think of a single server as like a single neuron, and a cloud of connected servers as the equivalent of the brain. A single neuron is not of itself intelligent; it takes billions of connected neurons to generate intelligence. Yet another analogy is the difference between an individual bee and the intelligence of a bee colony.
The cloud is about the collective group of computers being considered not as individual servers, but as resources, with CPU cycles, memory, storage and so on all existing as one huge pool of combined resources. Applications running ‘in the cloud’ use the resources they require when they require them, rather than having a preset amount of allocated resources. This has the promise of being more cost effective, far more scalable, as applications are not limited to the resources of a single server, and more flexible, as more power is available whenever it is needed.
Containers, Docker & Kubernetes
In place of running on a ‘server’ device, applications in the cloud run in a ‘container’. While there are alternatives, the open source ‘docker’ is the best known container system, and an application can be wrapped in a docker container in order to then be run in the cloud. The whole cloud concept requires ‘containers’.
You can run docker (or alternative container types) on your own Windows or Mac desktop or laptop computer to enable running container applications. Each container application with docker runs in an environment based on Linux rather than Windows or MacOS, even if your computer is not a Linux computer. Running docker on a single computer is not the point, but it does allow testing and getting a feel for how the cloud comes together. Note that Linux, Windows and MacOS are all complete operating systems, while a docker container needs only the small central ‘kernel’, not a complete operating system, so the overhead is not like running multiple copies of Linux on your computer. Other services beyond the kernel are present only if required by the container. A container is far more lightweight than an entire computer or virtual computer, so running things in a container system such as docker on a single computer is not necessarily a problem, but it is not the point either. It takes more than one computer to leverage the full power of containers.
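As a sketch, wrapping an application in a container is little more than writing a short recipe. This hypothetical Dockerfile assumes a small Python web application in a file called app.py:

```dockerfile
# Recipe for a container image holding one small web application.
# The image carries a Linux userland, but shares the host's kernel.
FROM python:3.12-slim

WORKDIR /app
COPY app.py .

EXPOSE 8000
CMD ["python", "app.py"]
```

With docker installed, something like `docker build -t myapp .` followed by `docker run -p 8000:8000 myapp` builds the image and runs the container locally, which is useful for testing before the same container is run in a cloud.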
Running containers across a group of computers to form a ‘cloud’ of computers requires a ‘container orchestration’ program. The open source Kubernetes is the best known, though others such as Docker Swarm perform the same function. Kubernetes also includes control panels that provide a ‘cPanel’ for the cloud, but unlike cPanel, Kubernetes is open source, available for free, and has support from the entire industry.
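As a sketch of what asking Kubernetes to run containers looks like, here is a minimal Deployment; the image name myapp:latest is hypothetical, and Kubernetes itself decides which machines in the cluster actually run the three copies:

```yaml
# A Kubernetes Deployment: 'run three copies of this container,
# somewhere in the cluster, and keep them running'.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: myapp
spec:
  replicas: 3
  selector:
    matchLabels:
      app: myapp
  template:
    metadata:
      labels:
        app: myapp
    spec:
      containers:
        - name: myapp
          image: myapp:latest   # hypothetical container image
          ports:
            - containerPort: 8000
```

The declaration never names a computer: that is the shift from ‘my server’ to ‘the cloud’.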
While you can get very low cost shares of pizza slices, public cloud can be even more economical, and could qualify for a free tier with a cloud provider. Free tiers are limited, and expertise is more expensive, as knowledge of cloud commands higher pay rates than knowledge of ‘longer established’ ways of doing things. So cloud could in the end be more or less expensive, but it will increase flexibility and most likely significantly increase security over shared hosting. The more recent the cost comparison, for the very small hosting requirements, the more the equation favours cloud. The larger the requirements, the more the equation favours cloud. The writing is on the wall for traditional hosting. Cloud also opens a whole new world of possibilities, with the cloud effectively becoming a neural network of ‘serverless functions’ and IoT (internet of things) functionality, and the on-demand nature ultimately delivering greater cost efficiencies. The current cloud is a collective intelligence to tap into that is still like a baby; as connections grow, so will the power available by leveraging the cloud.
Simply combine a group of computers under the management of Kubernetes, and you have a private cloud. The same requirements apply as with bare metal servers: to be a practical commercial system, as opposed to just a test system, a private cloud needs an uninterruptible power supply, full time maintenance staff and multiple connections to the internet. You could technically have just one computer as either a private cloud or a private bare metal server, but neither is economically attractive, and neither alone gives that property of combined intelligence. For a large business running a commercial private cloud, the advantages over bare metal servers increase with scale, but generally, other than as a testing or learning tool, the economics mean you need a compelling reason not to use a public cloud and/or need to be a very large operation.
If you run tests on a single private server, or even a private cloud, how do you access all the computing power available in the public cloud? The answer is a hybrid cloud: tapping into public cloud resources and the overall ‘hive intelligence’ by utilising services already running in the public cloud, while keeping in the private cloud the specific developments that need to stay private.
Serverless is a somewhat confusing name for small services that run in the cloud. A computer in the cloud is still required, but neither the programmer nor the program code in question has to do anything about the server. The code is commissioned to run in the cloud and just runs in the ether… or (at least usually) the ethernet.
Such software services can almost appear organic in nature, living in the cloud and interacted with on demand. The services offered by ‘serverless’ functions can be AI, and can tap enormous resources for almost zero cost. For example, free tier cloud services can allow as much as 1,000,000 usages per month at no cost. An example will probably be added in a future page, but look out for this term in the future of computing, and for the role it plays in transforming just what computers can achieve.
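As a sketch of the ‘serverless’ shape, assuming the common ‘event in, result out’ handler style used by cloud function platforms (the event fields here are hypothetical):

```python
# A serverless-style function: no server code at all, just a handler.
# The cloud platform calls it once per request, wherever it chooses.
def handler(event, context=None):
    # 'event' carries the request data; 'name' is a hypothetical field.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": f"Hello, {name}!",
    }
```

The function is still ordinary code and can be called locally, e.g. `handler({"name": "store-one"})`; the serverless part is that deployment, scaling, and the server itself are the platform’s problem, not the programmer’s.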
Costing & Pricing: Which Choice?
DIY: This can be the least expensive, as long as you do know how to Do It Yourself, and you have a spare computer and an internet connection. However, it will normally produce a very unprofessional result, with website downtime resulting from the lack of an uninterruptible power supply, multiple high speed internet upload connections, and people on call to attend to any problems.
Bare Metal Pizza Boxes and Sharing:
Logically, of the server choices, it would make the most sense for a shared server to be the most cost effective. Unless you have a new idea and are doing something on the internet that no other web site does, a shared server will be the lowest cost, both in terms of the server and the people to help.
This choice relies on the fact that thousands of web sites have basically the same software needs, just different data. The data allows for a custom look and feel per site, as well as custom ‘blog entries’ or ‘products and prices’. As ‘blog’ and ‘store’ are the two common types of web sites, either of these, or a combination of the two, are strong candidates for a shared server. The more critical the data, and the more custom the software that is desired, the more likely another choice will be superior.
As a business scales, a more expensive server solution can make sense anyway, as can having your own resources to design your site and add the very custom features that no one else has, differentiating your site from the rest of the web.
How many people will visit your site per minute? With traditional web servers, in all categories, the scale of your solution is determined by what level of service you wish to provide at the busiest possible time. What if 10 people all wish to transact with your website at the exact same time? With shared servers, or when using virtual servers, the response can be ten times slower with 10 people online than with one, because the multiprocessing of the computer is already at work serving multiple sites. A complete server will suffer less degradation as the number of users climbs, but response times for the busiest case still need to be considered.
Each server request will require a certain amount of computer resources and time, and there must be sufficient resources to handle the busiest time. Consideration must be given to what causes the peak time. If 1,000 people visit per month, will they be spread evenly over every hour, night and day, of the entire month, or might something trigger 10 people to happen to make a request in the very same minute?
No matter what type of site, cloud has the potential to be more cost effective.
If you are Google or someone like that, your servers are busy 24 hours a day, every day, and with such traffic that peak periods are not such a change, so the wasted time is not that bad; and yet even a Google can save with cloud. But for us ‘not so big as Google yet’ types, there can be huge savings, as cloud does not need to be priced on the capacity available at peak time, but instead on how much of the cloud resources actually get used.
Consider a server with the capacity to serve just 3 people at once (and since one of those could be a web crawler, that might really be only 2 customers at once). If each request takes 15 seconds, a single one of those slots can handle 4 requests per minute, which over a month is a capacity of 4 × 60 × 24 × 30 = 172,800 requests: at least 172 times the 1,000 or so requests per month that a site scaled for serving 3 customers at once would typically expect. In other words, if the server is to be capable of handling requests when they come, the web server must spend the vast majority of its time waiting for possible requests. Most of the computing power is wasted with the computer sitting idle.
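The capacity arithmetic above can be checked with a few lines of Python; the 15-second request time and 1,000 monthly visits are the illustrative figures from this section, not measurements:

```python
# Monthly capacity of one request 'slot' versus a realistic workload.
REQUEST_SECONDS = 15                          # time to serve one request
requests_per_minute = 60 // REQUEST_SECONDS   # 4 requests per minute
requests_per_month = requests_per_minute * 60 * 24 * 30

EXPECTED_PER_MONTH = 1_000                    # the visitor count used earlier

print(requests_per_month)                          # 172800
print(requests_per_month // EXPECTED_PER_MONTH)    # 172
```

The ratio of capacity to actual use is the idle fraction that cloud pricing, which charges per request rather than per server, avoids paying for.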
With cloud, there is the potential in that example for a cost reduction by a factor of around 172. While cloud providers may not pass on that full reduction, by paying for only the computing power you actually use, the price of having an ‘online’ presence ready to respond at any time is dramatically reduced.
By dynamically allocating resources within the entire cloud, in place of just the resources within one pizza slice, the cloud can deliver an entirely different level of cost effectiveness. In fact, as many as 1,000,000 requests per month can be available on free tiers with cloud, yet dedicated servers able to handle the peak load possible with 1,000,000 requests per month can be quite expensive.
Conclusion & Summary
Cloud is a new step forward with cost savings and new horizons for computing. If you are working with servers, it is well worth getting on board with cloud.
Without cloud, a web site requires a ‘web server’ computer and application. Any computer can be a web server, or run a number of web servers. Each web server computer and application combination can also serve a number of websites, as many websites require the same software, just different data.
With cloud, there is no server computer, just a ‘cloud’, or pool of computing power, and server programs are dynamically loaded and run within that cloud. There is some extra technology, as your applications have to be able to be placed in ‘containers’, but the payback is too big to ignore.