Wednesday, December 2, 2015

Bandwidth

In computing, bandwidth is the bit-rate of available or consumed information capacity expressed typically in metric multiples of bits per second. Variously, bandwidth may be characterized as network bandwidth, data bandwidth, or digital bandwidth.
This definition of bandwidth contrasts with its use in signal processing, wireless communications, modem data transmission, digital communications, and electronics, where bandwidth refers to analog signal bandwidth measured in hertz: the frequency range between the lowest and highest attainable frequencies that meets a well-defined impairment level in signal power.
Network bandwidth capacity
The term bandwidth sometimes defines the net bit rate (also known as peak bit rate, information rate, or physical layer useful bit rate), channel capacity, or the maximum throughput of a logical or physical communication path in a digital communication system. For example, bandwidth tests measure the maximum throughput of a computer network. The maximum rate that can be sustained on a link is limited by the Shannon-Hartley channel capacity for these communication systems, which depends on the bandwidth in hertz and the noise on the channel.
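The Shannon-Hartley theorem puts that limit at C = B log2(1 + S/N), where B is the channel bandwidth in hertz and S/N is the signal-to-noise power ratio. A minimal Python sketch, assuming illustrative values (a 3.1 kHz telephone-style channel at 30 dB SNR, not figures from this article):

    import math

    def shannon_capacity(bandwidth_hz, snr_linear):
        """Shannon-Hartley channel capacity in bits per second.
        bandwidth_hz: analog bandwidth in hertz
        snr_linear:   signal-to-noise ratio as a linear power ratio (not dB)
        """
        return bandwidth_hz * math.log2(1 + snr_linear)

    snr = 10 ** (30 / 10)                   # 30 dB -> linear power ratio of 1000
    capacity = shannon_capacity(3100, snr)
    print(f"{capacity / 1000:.1f} kbit/s")  # about 30.9 kbit/s

At these assumed values the capacity works out to roughly 31 kbit/s, which is the neighborhood where analog telephone modems topped out.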
Network bandwidth consumption
Bandwidth in bit/s may also refer to consumed bandwidth, corresponding to achieved throughput or goodput, i.e., the average rate of successful data transfer through a communication path. This sense applies to concepts and technologies such as bandwidth shaping, bandwidth management, bandwidth throttling, bandwidth cap, and bandwidth allocation (for example the bandwidth allocation protocol and dynamic bandwidth allocation). A bit stream's bandwidth is proportional to the average consumed signal bandwidth in hertz (the average spectral bandwidth of the analog signal representing the bit stream) during a studied time interval.
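Bandwidth shaping and throttling are commonly enforced with a token-bucket mechanism: tokens accrue at the permitted rate, and traffic may be sent only while tokens remain. A minimal, illustrative Python sketch (the class name and the rate and burst values are assumptions, not taken from any particular implementation):

    import time

    class TokenBucket:
        """Token-bucket limiter: tokens refill at `rate` bytes/second up to
        `capacity`; sending consumes tokens, excess traffic must wait."""

        def __init__(self, rate, capacity):
            self.rate = rate              # refill rate, bytes per second
            self.capacity = capacity      # maximum burst, bytes
            self.tokens = capacity
            self.last = time.monotonic()

        def consume(self, nbytes):
            """Try to send nbytes now; return True if the bucket allows it."""
            now = time.monotonic()
            self.tokens = min(self.capacity,
                              self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= nbytes:
                self.tokens -= nbytes
                return True
            return False

    bucket = TokenBucket(rate=1_000_000, capacity=64_000)  # ~1 MB/s, 64 KB burst
    allowed = bucket.consume(1500)   # attempt to send one 1500-byte packet
    print("send" if allowed else "queue or drop")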

Channel bandwidth may be confused with useful data throughput (or goodput). For example, a channel rated at x bit/s may not necessarily transmit data at x bit/s, since protocols, encryption, and other factors can add appreciable overhead. For instance, much Internet traffic uses the Transmission Control Protocol (TCP), which requires a three-way handshake for each transaction. Although in many modern implementations the protocol is efficient, it does add significant overhead compared to simpler protocols. Also, data packets may be lost, which further reduces the useful data throughput. In general, any effective digital communication needs a framing protocol; overhead and effective throughput depend on the implementation. Useful throughput is less than or equal to the actual channel capacity minus implementation overhead.
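To see how much of a nominal link rate survives as goodput, a back-of-the-envelope estimate helps. The sketch below assumes a full-size TCP segment over Ethernet (1460 payload bytes out of 1538 bytes on the wire, counting assumed Ethernet, IP, and TCP overhead) and an illustrative 1% loss rate; real stacks with retransmission and congestion control behave less simply:

    def estimated_goodput(link_rate_bps, payload_bytes, overhead_bytes, loss_rate):
        """Scale the link rate by the payload fraction of each frame,
        then discount lost packets. A rough model, not a measurement."""
        payload_fraction = payload_bytes / (payload_bytes + overhead_bytes)
        return link_rate_bps * payload_fraction * (1 - loss_rate)

    # 100 Mbit/s link, 1460 payload bytes per 1538-byte frame, 1% loss.
    rate = estimated_goodput(100e6, 1460, 78, 0.01)
    print(f"{rate / 1e6:.1f} Mbit/s")  # about 94.0 Mbit/s of goodput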

Asymptotic bandwidth
The asymptotic bandwidth (formally asymptotic throughput) of a network is the measure of maximum throughput for a greedy source, for example as the message size, measured in packets from a source, approaches infinity.

Asymptotic bandwidths are usually estimated by sending a number of very large messages through the network and measuring the end-to-end throughput. Like other bandwidths, the asymptotic bandwidth is measured in multiples of bits per second.
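A toy model shows why the estimate uses very large messages: if each transfer costs a fixed latency plus serialization time, throughput n / (t0 + n/B) approaches the link rate B only as the message size n grows. The 10 ms latency and 1 Gbit/s rate below are illustrative:

    def throughput(message_bits, latency_s, link_rate_bps):
        """End-to-end throughput under a fixed-latency + serialization model."""
        return message_bits / (latency_s + message_bits / link_rate_bps)

    for bits in (1e6, 1e8, 1e10):
        mbps = throughput(bits, 0.010, 1e9) / 1e6
        print(f"{bits:.0e} bits -> {mbps:.1f} Mbit/s")
    # 1e+06 bits -> 90.9 Mbit/s
    # 1e+08 bits -> 909.1 Mbit/s
    # 1e+10 bits -> 999.0 Mbit/s  (approaching the 1000 Mbit/s link rate)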

Multimedia bandwidth
Digital bandwidth may also refer to the multimedia bit rate or average bitrate after multimedia data compression (source coding), defined as the total amount of data divided by the playback time.
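As a worked example under assumed numbers (a 4-minute track stored in a 5.8 MB compressed file):

    def average_bitrate_bps(file_size_bytes, duration_s):
        """Average multimedia bitrate: total compressed size over playback time."""
        return file_size_bytes * 8 / duration_s

    bitrate = average_bitrate_bps(5.8e6, 4 * 60)
    print(f"{bitrate / 1000:.0f} kbit/s")  # about 193 kbit/s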

Bandwidth in web hosting
In web hosting services, the term bandwidth is often[6] incorrectly used to describe the amount of data transferred to or from the website or server within a prescribed period of time, for example bandwidth consumption accumulated over a month, measured in gigabytes per month. The more accurate phrase for this meaning, a maximum amount of data transfer in each month or other given period, is monthly data transfer.
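For a sense of scale, a monthly transfer allowance corresponds to a fairly modest sustained bit rate. A minimal sketch, assuming decimal gigabytes (1 GB = 10^9 bytes), a 30-day month, and a hypothetical 1000 GB plan:

    def sustained_rate_mbps(monthly_transfer_gb, days=30):
        """Average rate that, sustained around the clock, uses the allowance."""
        bits = monthly_transfer_gb * 1e9 * 8
        return bits / (days * 24 * 3600) / 1e6

    print(f"{sustained_rate_mbps(1000):.2f} Mbit/s")  # about 3.09 Mbit/s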

A similar situation can occur for end-user ISPs as well, especially where network capacity is limited (for example in areas with underdeveloped Internet connectivity and on wireless networks).

ASCII

[Figure: ASCII chart from a 1972 printer manual (b1 is the least significant bit)]
ASCII (/ˈæski/ ass-kee), abbreviated from American Standard Code for Information Interchange,[1] is a character-encoding scheme (the IANA prefers the name US-ASCII[2]). ASCII codes represent text in computers, communications equipment, and other devices that use text. Most modern character-encoding schemes are based on ASCII, though they support many additional characters. ASCII was the most common character encoding on the World Wide Web until December 2007, when it was surpassed by UTF-8, which includes ASCII as a subset.[3][4][5]
ASCII developed from telegraphic codes. Its first commercial use was as a seven-bit teleprinter code promoted by Bell data services. Work on the ASCII standard began on October 6, 1960, with the first meeting of the American Standards Association's (ASA) X3.2 subcommittee. The first edition of the standard was published in 1963,[6][7] underwent a major revision in 1967,[8][9] and received its most recent update in 1986.[10] Compared to earlier telegraph codes, the proposed Bell code and ASCII were both ordered for more convenient sorting (i.e., alphabetization) of lists, and added features for devices other than teleprinters.
Originally based on the English alphabet, ASCII encodes 128 specified characters into seven-bit binary integers, as shown in the ASCII chart above.[11] The characters encoded are the numbers 0 to 9, the lowercase letters a to z, the uppercase letters A to Z, basic punctuation symbols, control codes that originated with Teletype machines, and a space. For example, lowercase j becomes binary 1101010, decimal 106. Of the 128 characters, 33 are non-printing control characters (many now obsolete)[12] that affect how text and space are processed,[13] and 95 are printable characters, including the space (which is considered an invisible graphic).
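The j example is easy to check; in Python, for instance:

    # Lowercase j: decimal 106, seven-bit binary 1101010.
    code = ord('j')
    print(code)                  # 106
    print(format(code, '07b'))   # 1101010
    print(chr(code))             # j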

Archive

noun
1. Usually, archives. Documents or records relating to the activities, business dealings, etc., of a person, family, corporation, association, community, or nation.
2. archives, a place where public records or other historical documents are kept.
3. any extensive record or collection of data:
   The encyclopedia is an archive of world history. The experience was sealed in the archive of her memory.
4. Digital Technology.
   1. a long-term storage device, as a disk or magnetic tape, or a computer directory or folder that contains copies of files for backup or future reference.
   2. a collection of digital data stored in this way.
   3. a computer file containing one or more compressed files.
   4. a collection of information permanently stored on the Internet:
      The magazine has its entire archive online, from 1923 to the present.
verb (used with object), archived, archiving.
5. to place or store in an archive:
   to vote on archiving the city's historic documents.
6. Digital Technology. to compress (computer files) and store them in a single file (see the sketch after this entry).
Origin of archive
1595-1605; orig., as plural < French archives < Latin archīva < Greek archeîa, orig. plural of archeîon public office, equivalent to arch(ḗ) magistracy, office + -eîon suffix of place
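Sense 6 above, archiving in the digital sense, is what tools such as zip and tar do. A minimal sketch using Python's standard zipfile module; the file and directory names are hypothetical:

    import zipfile

    # Compress several files into a single archive file.
    with zipfile.ZipFile("records.zip", "w", compression=zipfile.ZIP_DEFLATED) as zf:
        for path in ("minutes.txt", "ledger.csv"):
            zf.write(path)

    # Later: list the archive's contents and restore them.
    with zipfile.ZipFile("records.zip") as zf:
        print(zf.namelist())
        zf.extractall("restored/")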