Monday, April 5, 2010
Client-server computing or networking
Client-server computing or networking is a distributed application architecture that partitions tasks or workloads between service providers, called servers, and service requesters, called clients.
Clients and servers often operate over a computer network on separate hardware. A server machine is a high-performance host that runs one or more server programs, which share their resources with clients. A client does not share any of its resources; instead it requests content or a service function from a server. Clients therefore initiate communication sessions with servers, which await (listen for) incoming requests.
The client-server characteristic describes the relationship of cooperating programs in an application. The server component provides a function or service to one or many clients, which initiate requests for such services.
Functions such as email exchange, web access, and database access are built on the client-server model. For example, a web browser is a client program running on a user's computer that may access information stored on a web server on the Internet. Users accessing banking services from their computer use a web browser client to send a request to a web server at a bank. That program may in turn forward the request to its own database client program, which sends a request to a database server at another bank computer to retrieve the account information. The balance is returned to the bank's database client, which in turn serves it back to the web browser client, which displays the result to the user.
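To make that chain concrete, here is a minimal sketch in Python of a web server that answers the browser's request by itself acting as a database client. The file accounts.db, the accounts table, and account id 1 are illustrative assumptions, not a real bank's setup.

```python
# Hypothetical sketch of a web server that is also a database client.
# accounts.db, the accounts table, and account id 1 are assumptions.
import sqlite3
from http.server import BaseHTTPRequestHandler, HTTPServer

class BalanceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The web server turns around and acts as a client of the database.
        conn = sqlite3.connect("accounts.db")
        try:
            row = conn.execute(
                "SELECT balance FROM accounts WHERE id = ?", (1,)
            ).fetchone()
        except sqlite3.OperationalError:
            row = None  # table missing in this toy setup
        finally:
            conn.close()
        body = ("Balance: %s" % (row[0] if row else "unknown")).encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# A web browser (the client) would then request http://localhost:8000/
HTTPServer(("localhost", 8000), BalanceHandler).serve_forever()
```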
The client-server model has become one of the central ideas of network computing. Many business applications being written today use the client-server model. So do the Internet's main application protocols, such as HTTP, SMTP, Telnet, DNS. In marketing, the term has been used to distinguish distributed computing by smaller dispersed computers from the "monolithic" centralized computing of mainframe computers. But this distinction has largely disappeared as mainframes and their applications have also turned to the client-server model and become part of network computing.
Each instance of the client software can send data requests to one or more connected servers. In turn, the servers can accept these requests, process them, and return the requested information to the client. Although this concept can be applied for a variety of reasons to many different kinds of applications, the architecture remains fundamentally the same.
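The request/response cycle itself can be shown with a few lines of plain TCP sockets; this is a minimal sketch, with the loopback address and port 5050 as placeholder choices, not a prescribed protocol.

```python
# Minimal sketch of one request/response cycle over TCP sockets.
# The loopback address and port 5050 are placeholder choices.
import socket
import threading
import time

HOST, PORT = "127.0.0.1", 5050

def serve_once():
    # Server side: listen, accept one request, process it, reply.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, _addr = srv.accept()
        with conn:
            request = conn.recv(1024).decode()
            conn.sendall(("echo: " + request).encode())

threading.Thread(target=serve_once, daemon=True).start()
time.sleep(0.2)  # give the listener a moment to start

# Client side: initiate the session, send a request, read the reply.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect((HOST, PORT))
    cli.sendall(b"hello")
    print(cli.recv(1024).decode())  # prints "echo: hello"
```

The same shape recurs in every client-server protocol: the server binds and listens, the client initiates, and each request is matched by a response.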
The most basic type of client-server architecture employs only two types of hosts: clients and servers. This type of architecture is sometimes referred to as two-tier. It allows devices to share files and resources. In a two-tier architecture, the client acts as one tier and the application running on the server acts as the other tier.
The interaction between client and server is often described using sequence diagrams. Sequence diagrams are standardized in the Unified Modeling Language.
Specific types of clients include web browsers, email clients, and online chat clients.
Specific types of servers include web servers, FTP servers, application servers, database servers, name servers, mail servers, file servers, print servers, and terminal servers. Most web services are also types of servers.
File sharing is the practice of distributing or providing access to digitally stored information, such as computer programs, multimedia (audio, video), documents, or electronic books. It may be implemented in a variety of storage, transmission, and distribution models. Common methods are manual sharing using removable media, centralized file server installations on computer networks, World Wide Web-based hyperlinked documents, and the use of distributed peer-to-peer (P2P) networking.
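As one concrete illustration of the centralized file-server model, Python's standard library can expose a directory over HTTP; the directory name shared and port 8080 below are assumptions for the sketch, not a recommended deployment.

```python
# Sketch of the centralized file-server model: one host shares a folder
# over HTTP and any client on the network can fetch files from it.
# The directory name "shared" and port 8080 are illustrative choices.
from functools import partial
from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer

handler = partial(SimpleHTTPRequestHandler, directory="shared")
server = ThreadingHTTPServer(("0.0.0.0", 8080), handler)
print("Sharing ./shared at http://<this-host>:8080/")
server.serve_forever()
```

Any client on the network can then fetch a file from that folder with an ordinary web browser or with curl pointed at the corresponding URL.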
File sharing is not in itself illegal. However, the increasing popularity of the MP3 music format in the late 1990s led to the release and growth of Napster and other software that aided the sharing of electronic files. In practice this led to a huge growth in illegal file sharing: the sharing of copyright-protected files without permission.
Although the original Napster service was shut down by court order, it paved the way for decentralized peer-to-peer file sharing networks such as Gnutella, Gnutella2, eDonkey2000, the now-defunct Kazaa network, and BitTorrent.
Many file sharing networks and services, accused of facilitating illegal file sharing, have been shut down due to litigation by groups such as the RIAA and MPAA. During the early 2000s, the fight against copyright infringement expanded into lawsuits against individual users of file sharing software.
The economic impact of illegal file sharing on media industries is disputed. Some studies conclude that unauthorized downloading of movies, music and software is unequivocally damaging the economy, while other studies suggest file sharing is not the primary cause of declines in sales. Illegal file sharing remains widespread, with mixed public opinion about the morality of the practice.
A peer-to-peer network, commonly abbreviated P2P, is any distributed network architecture composed of participants that make a portion of their resources (such as processing power, disk storage, or network bandwidth) directly available to other network participants, without the need for central coordination instances (such as servers or stable hosts).
Peers are both suppliers and consumers of resources, in contrast to the traditional client-server model where only servers supply, and clients consume.
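A toy sketch of that dual role follows, assuming two peers on localhost with made-up ports and a one-line "ask for a file by name" protocol: each peer listens like a server while also issuing requests like a client.

```python
# Toy peer: it listens for requests like a server and also issues
# requests to other peers like a client. Ports 7001/7002 and the
# one-line file-name protocol are assumptions for illustration.
import socket
import threading
import time

def run_peer(port, files):
    def listen():
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
            srv.bind(("127.0.0.1", port))
            srv.listen()
            while True:
                conn, _ = srv.accept()
                with conn:
                    name = conn.recv(1024).decode().strip()
                    conn.sendall(files.get(name, "NOT FOUND").encode())
    threading.Thread(target=listen, daemon=True).start()

def fetch(port, name):
    # Acting as a client toward another peer.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect(("127.0.0.1", port))
        cli.sendall(name.encode())
        return cli.recv(1024).decode()

run_peer(7001, {"song.mp3": "<contents of song.mp3>"})
run_peer(7002, {"doc.txt": "<contents of doc.txt>"})
time.sleep(0.2)  # let both listeners start
print(fetch(7001, "song.mp3"))  # peer 7001 supplies the file
print(fetch(7002, "song.mp3"))  # peer 7002 answers NOT FOUND
```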
Peer-to-peer was popularized by large distributed file sharing systems such as Napster, the first of its kind, in the late 1990s. The concept has since inspired new structures and philosophies in other areas of human interaction. In such social contexts, peer-to-peer as a meme refers to the egalitarian social networking that is currently emerging throughout society, enabled by Internet technologies in general, and it invites a critical look at current authoritarian and centralized social structures.
The peer-to-peer paradigm has been elucidated by Michel Bauwens in his thesis Peer to Peer and Human Evolution.
Peer-to-peer networks are typically formed dynamically by ad-hoc additions of nodes. In an 'ad-hoc' network, the removal of nodes has no significant impact on the network. The distributed architecture of an application in a peer-to-peer system provides enhanced scalability and service robustness.
Peer-to-peer systems often implement an Application Layer overlay network on top of the native or physical network topology. Such overlays are used for indexing and peer discovery. Content is typically exchanged directly over the underlying Internet Protocol (IP) network. Anonymous peer-to-peer systems are an exception, and implement extra routing layers to obscure the identity of the source or destination of queries.
In structured peer-to-peer networks, connections in the overlay are fixed. They typically use distributed hash table (DHT) indexing, such as in the Chord system developed at MIT.
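A minimal way to see what DHT-style indexing does is to hash both node names and content keys onto a shared identifier ring and assign each key to its successor node, which is the placement rule Chord uses. The node names, the 16-bit ring, and the brute-force lookup below are illustrative simplifications; real Chord also maintains finger tables so lookups take O(log n) hops.

```python
# Toy DHT-style placement on an identifier ring, in the spirit of Chord.
# Node names and the 16-bit ring size are arbitrary assumptions.
import hashlib

RING = 2 ** 16

def ident(name):
    # Hash a node name or content key onto the ring.
    return int(hashlib.sha1(name.encode()).hexdigest(), 16) % RING

def successor(key_id, node_ids):
    # A key belongs to the first node at or after it, wrapping around.
    for nid in sorted(node_ids):
        if nid >= key_id:
            return nid
    return min(node_ids)

nodes = {ident(n): n for n in ("peerA", "peerB", "peerC", "peerD")}
for key in ("song.mp3", "doc.txt"):
    owner = successor(ident(key), nodes.keys())
    print(key, "-> stored at", nodes[owner])
```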