In most common use, a server is a physical computer (a computer hardware system) dedicated to running one or more services (as a host), to serve the needs of the users of other computers on a network. Depending on the computing service that it offers, it could be a database server, file server, mail server, print server, web server, gaming server, or some other kind of server.
In the context of client–server architecture, a server is a computer program that runs to serve the requests of other programs, the "clients". Thus, the server performs some computational task on behalf of its clients, which either run on the same computer or connect over a network.
In the context of Internet Protocol (IP) networking, a server is a program that operates as a socket listener.
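As a minimal sketch of this listener role (the Python standard library, port number, and reply text here are illustrative assumptions, not details from the article), a server binds a socket, listens, and blocks until a client connects:

```python
# Minimal sketch of a server operating as a socket listener
# (the port and the reply text are arbitrary choices).
import socket

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as listener:
    listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    listener.bind(("0.0.0.0", 7000))    # claim an address and port
    listener.listen()                   # start listening on the socket
    while True:
        conn, addr = listener.accept()  # block until a client connects
        with conn:
            request = conn.recv(1024)   # read the client's request
            conn.sendall(b"hello from the server\n")
```

Any program that connects to that port, whether on the same machine or across the network, plays the role of the client.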
Servers often provide essential services across a network, either to private users inside a large organization or to public users via the Internet.
The term server is used quite broadly in information technology. Despite the many server-branded products available (such as server versions of hardware, software, or operating systems), in theory any computerised process that shares a resource with one or more client processes is a server. To illustrate this, take the common example of file sharing. While the existence of files on a machine does not make it a server, the mechanism by which the operating system shares those files with clients is the server.
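As a rough sketch of such a sharing mechanism (Python's standard library, the port, and the bind address are illustrative assumptions), the process below serves the files in its current directory to any client that requests them over HTTP; the files themselves are not the server, the sharing process is:

```python
# The files on disk are not a server by themselves; this process,
# which shares them with clients, is. (Port 8000 is an arbitrary choice.)
from http.server import HTTPServer, SimpleHTTPRequestHandler

with HTTPServer(("0.0.0.0", 8000), SimpleHTTPRequestHandler) as httpd:
    httpd.serve_forever()  # hand out files from the current directory on request
```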
Similarly, consider a web server application (such as the multiplatform Apache HTTP Server). This web server software can be run on any capable computer. For example, while a laptop or personal computer is not typically known as a server, it can in such a situation fulfill the role of one, and hence be labelled as one. It is, in this case, the machine's role that places it in the category of server.
In the hardware sense, the word server typically designates computer models intended for hosting software applications under the heavy demand of a network environment. In this client–server configuration, one or more machines, either computers or computer appliances, share information with each other, with one acting as a host for the others.
While nearly any personal computer is capable of acting as a network server, a dedicated server will contain features making it more suitable for production environments. These features may include a faster CPU, more high-performance RAM, and increased storage capacity in the form of larger or multiple hard drives. Servers also typically have reliability, availability and serviceability (RAS) and fault-tolerance features, such as redundancy in power supplies, storage (as in RAID), and network connections.
Servers became common in the early 1990s as businesses increasingly began using personal computers to provide services formerly hosted on larger mainframes or minicomputers. Early file servers housed multiple CD-ROM drives, which were used to host large database applications.
Between the 1990s and 2000s, the increasing use of dedicated hardware saw the advent of self-contained server appliances. One well-known product is the Google Search Appliance, a unit that combines hardware and software in an out-of-the-box package. Simpler examples of such appliances include switches, routers, gateways, and print servers, all of which are available in a near plug-and-play configuration.
Modern operating systems such as Microsoft Windows or Linux distributions are arguably designed with a client–server architecture in mind. These operating systems attempt to abstract the hardware, allowing a wide variety of software to work with components of the computer. In a sense, the operating system can be seen as serving hardware to the software, which in all but low-level programming languages must interact with it through an API.
These operating systems may be able to run programs in the background, called either services or daemons. Such programs, such as the aforementioned Apache HTTP Server software, may wait in a sleep state until a request for their service arrives. Since any software that provides services can be called a server, modern personal computers can be seen as a forest of servers and clients operating in parallel.
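The sketch below (a hypothetical illustration, not taken from any particular operating system) shows this pattern in miniature: an echo service sleeps in the background as a daemon thread until a client, here running on the same computer, needs it.

```python
# Hypothetical sketch: a background "daemon" service and a client on the same machine.
import socket
import threading

ready = threading.Event()

def echo_service(port: int) -> None:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        s.bind(("127.0.0.1", port))
        s.listen()
        ready.set()                      # signal that the listener is up
        while True:
            conn, _ = s.accept()         # sleep here until a client needs service
            with conn:
                conn.sendall(conn.recv(1024))

# Run the service in the background, as a daemon thread.
threading.Thread(target=echo_service, args=(9000,), daemon=True).start()
ready.wait()                             # wait until the service is listening

# The "client", running on the same computer.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as client:
    client.connect(("127.0.0.1", 9000))
    client.sendall(b"ping")
    print(client.recv(1024))             # b'ping'
```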
The Internet itself is also a forest of servers and clients. Merely requesting a web page from a few kilometers away involves satisfying a stack of protocols and many hardware and software servers. At a minimum, these include the routers, modems, domain name servers, and various other servers necessary to deliver the World Wide Web.
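To make two of those servers concrete, the sketch below (the host name and the Python standard-library calls are illustrative assumptions) first asks a domain name server to resolve a name, then asks the web server at the resulting address for a page:

```python
# Two of the servers involved in fetching a web page: a name server and a web server.
import socket
import http.client

host = "example.org"                              # placeholder host name

# 1. A domain name server translates the host name into an IP address.
ip_address = socket.gethostbyname(host)
print("resolved by a name server:", ip_address)

# 2. The web server at that address answers the HTTP request, after the packets
#    have crossed routers, modems, and other intermediaries.
conn = http.client.HTTPConnection(host, 80, timeout=10)
conn.request("GET", "/")
response = conn.getresponse()
print("answered by the web server:", response.status, response.reason)
conn.close()
```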
The introduction of cloud computing allows server storage and other resources to be shared in a pool and gives servers a higher degree of fault tolerance.
Hardware requirements for servers vary, depending on the server application. Absolute CPU speed is not usually as critical to a server as it is to a desktop machine. Because servers provide service to many users over a network, they have different requirements, such as fast network connections and high I/O throughput. Since servers are usually accessed over a network, they may run in headless mode without a monitor or input device, and processes that are not needed for the server's function are not run. Many servers do not have a graphical user interface (GUI), as it is unnecessary and consumes resources that could be allocated elsewhere. Similarly, audio and USB interfaces may be omitted.
Servers often run for long periods without interruption, and availability must often be very high, making hardware reliability and durability extremely important. Although servers can be built from commodity computer parts, mission-critical enterprise servers are ideally very fault-tolerant and use specialized hardware with low failure rates in order to maximize uptime, for even a short-term failure can cost more than purchasing and installing the system. For example, it may take only a few minutes of downtime at a national stock exchange to justify the expense of entirely replacing the system with something more reliable. Servers may incorporate faster, higher-capacity hard drives, larger computer fans or water cooling to help remove heat, and uninterruptible power supplies that ensure the servers continue to function in the event of a power failure. These components offer higher performance and reliability at a correspondingly higher price. Hardware redundancy (installing more than one instance of modules such as power supplies and hard disks, arranged so that if one fails another is automatically available) is widely used. ECC memory devices that detect and correct errors are used; non-ECC memory is more likely to cause data corruption.
To increase reliability, most servers use memory with error detection and correction, redundant disks, redundant power supplies, and so on. Such components are also frequently hot-swappable, allowing technicians to replace them on the running server without shutting it down. To prevent overheating, servers often have more powerful fans. As servers are usually administered by qualified system administrators, their operating systems are also tuned more for stability and performance than for user friendliness and ease of use, with Linux taking a noticeably larger share than on desktop computers.
Because servers need a stable power supply, good Internet access, and increased security, and are also noisy, it is usual to store them in dedicated server centers or special rooms. This requires managing power consumption, as the extra energy used generates heat that can cause the temperature in the room to exceed acceptable limits; hence server rooms are normally equipped with air conditioning. Server casings are usually flat and wide (typically measured in "rack units"), adapted so that many devices can be mounted together in a server rack. Unlike ordinary computers, servers can usually be configured, powered up and down, or rebooted remotely, using out-of-band management, typically based on IPMI.
Many servers take a long time to start up the hardware and load the operating system. Servers often perform extensive pre-boot memory testing and verification and start up remote management services. The hard drive controllers then start up banks of drives sequentially, rather than all at once, so as not to overload the power supply with startup surges, and afterwards they initiate RAID pre-checks to verify the correct operation of redundancy. It is common for a machine to take several minutes to start up, but it may not need restarting for months or years.