ASCII is an acronym you may have heard referring to computer text, but it’s a term that’s quickly falling out of use thanks to a more powerful newcomer. But what is ASCII and what is it used for?
What does ASCII stand for?
Perhaps the easiest place to start is with the acronym itself. So let’s expand it:
**A**merican **S**tandard **C**ode for **I**nformation **I**nterchange
This mouthful of a phrase doesn’t paint the full picture, but parts of it offer immediate clues, especially the first two words: ASCII is an American standard, a fact whose importance will soon become apparent.
“Code for Information Interchange” suggests that it is a format for the back and forth transmission of data. In particular, ASCII deals with text data: characters that make up words in a human-readable language.
ASCII solves the problem of assigning values to letters and other characters, so that text represented as ones and zeros in a file can be translated back into letters when the file is later read. When different computer systems agree on the same code, this information can be exchanged reliably.
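A minimal Python sketch of that round trip, using the standard library’s built-in `"ascii"` codec: each character becomes a numeric byte value on the way out, and the agreed-upon code turns the numbers back into the same characters on the way in.

```python
# Encoding text to ASCII byte values and back: because both sides agree
# on the same code, the round trip is lossless.
text = "Hello"
encoded = text.encode("ascii")     # one numeric value per character
print(list(encoded))               # [72, 101, 108, 108, 111]
decoded = encoded.decode("ascii")  # translate the numbers back into letters
assert decoded == text
```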
Related: How to find symbols and look up their meaning
The History of ASCII
Sometimes referred to as US-ASCII, ASCII was an American innovation that emerged in the 1960s. The standard has undergone several revisions since then, most notably in 1977 and 1986.
Extensions and variations have built on ASCII over the years, largely because ASCII omits many characters used, or even required, by languages other than US English. ASCII doesn’t even include the British currency symbol (“£”), although the pound is included in Latin-1, an 8-bit extension developed in the 1980s that also encodes several other currencies.
ASCII has been significantly expanded on and superseded by Unicode, a much broader and more ambitious standard discussed below. In 2008, Unicode overtook ASCII in popularity for online use.
What characters does ASCII represent?
Computers are just as unfamiliar with the letter “A” as with the color purple or the feeling of jealousy. Computers deal in ones and zeros, and it’s up to humans to decide how those ones and zeros represent numbers, words, images, and everything else.
You can think of ASCII as the digital world’s Morse code, or at least a first attempt at one. While Morse code represents only 36 different characters (26 letters and 10 digits), ASCII was designed to represent up to 128 different characters in 7 bits of data.
ASCII is case-sensitive: it encodes all 52 uppercase and lowercase letters of the English alphabet. Along with the 10 digits, that accounts for about half the available space.
Punctuation, mathematical, and typographic symbols take up most of the rest, along with a collection of control characters: special non-printable codes with functional meanings (see below for more information).
Here are some typical characters ASCII encodes:
| Binary | Decimal | Character |
|---|---|---|
| 010 0001 | 33 | ! |
| 011 0000 | 48 | 0 |
| 011 1001 | 57 | 9 |
| 011 1011 | 59 | ; |
| 100 0001 | 65 | A |
| 100 0010 | 66 | B |
| 101 1010 | 90 | Z |
| 101 1011 | 91 | [ |
| 110 0001 | 97 | a |
| 110 0010 | 98 | b |
| 111 1101 | 125 | } |
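The mappings in the table can be checked directly in Python, where the built-in `ord()` returns a character’s code point and `chr()` reverses it:

```python
# ord() gives a character's code point; chr() goes the other way.
# These pairs match the rows of the table above.
assert ord("!") == 33
assert ord("A") == 65
assert ord("a") == 97
assert chr(90) == "Z"
assert chr(125) == "}"

# The binary column is simply the same number written in base 2:
print(format(ord("A"), "07b"))  # 1000001
```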
Note that the selected values have some useful properties, in particular:
- Letters of the same case can always be sorted numerically, since they appear in alphabetical order. For example, A has a lower value than B, which has a lower value than Z.
- Uppercase and lowercase letters are offset by exactly 32. This simplifies translation between cases, since only a single bit needs to be toggled to convert a letter in either direction.
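Both properties are easy to demonstrate. A short Python sketch: toggling the single bit with value 32 (binary 010 0000) flips a letter’s case, and numeric comparison sorts letters alphabetically.

```python
CASE_BIT = 0x20  # 32, the only bit that differs between cases

def toggle_case(ch):
    # XOR flips exactly that one bit, converting 'a' <-> 'A', 'z' <-> 'Z', etc.
    return chr(ord(ch) ^ CASE_BIT)

print(toggle_case("a"))  # A
print(toggle_case("Z"))  # z

# Same-case letters sort correctly because their values are in order:
assert ord("A") < ord("B") < ord("Z")
```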
Aside from letters, punctuation marks, and digits, ASCII represents a number of control characters: special code points that do not produce printable output but instead carry functional meaning for the programs that consume the data.
For example, ASCII 000 1001 is the horizontal tab character: it represents the space you get when you press the TAB key. Such characters are usually not displayed directly, but their effect often is. Here are some more examples:
| Binary | Decimal | Character |
|---|---|---|
| 000 1001 | 9 | Horizontal Tab |
| 000 1010 | 10 | Line Feed |
| 001 0111 | 23 | End of Transmission Block |
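In Python, the first two of these control characters correspond to the familiar `"\t"` and `"\n"` escape sequences; printing them shows their effect rather than a glyph:

```python
tab = chr(9)       # horizontal tab
newline = chr(10)  # line feed

# The control characters themselves are invisible, but their effect
# on the layout of the output is not:
print("col1" + tab + "col2" + newline + "next line")

assert tab == "\t" and newline == "\n"
```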
What about other characters?
ASCII was hugely successful in the early days of computing because it was simple and widely used. However, in a world with a more international perspective, a single writing system will not do the trick. Modern communication must be possible in French, in Japanese, and in every language in which we want to store text.
The Unicode character set can address a total of 1,112,064 different characters, although currently only about a tenth of them are actually defined. That might sound like a lot, but the encoding not only aims to accommodate tens of thousands of Chinese characters; it also covers emoji (nearly fifteen hundred of them) and even extinct writing systems such as Jurchen.
Related: Top 100 Emojis Explained
Unicode acknowledged ASCII’s dominance when selecting its first 128 characters: they are exactly the same as ASCII’s. This allows ASCII-encoded files to be used in situations where Unicode is expected, ensuring backwards compatibility.
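This backwards compatibility can be verified in a couple of lines: because Unicode’s first 128 code points are identical to ASCII, any pure-ASCII byte sequence is also valid UTF-8, and vice versa for that range.

```python
# An ASCII-encoded file can be read by anything expecting UTF-8...
data = "plain ASCII text".encode("ascii")
assert data.decode("utf-8") == "plain ASCII text"

# ...and UTF-8 produces byte-identical output for pure-ASCII text.
assert "ASCII".encode("utf-8") == "ASCII".encode("ascii")
```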
ASCII represents the 26 letters of the English alphabet along with digits, punctuation marks, and some other symbols. It served its purpose very well for the better part of half a century.
It has now been superseded by Unicode, which supports a variety of languages and other symbols, including emoji. For all practical purposes, UTF-8 is the encoding that should be used to represent Unicode characters online.
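A quick illustration of why UTF-8 works so well online: it is a variable-width encoding, so ASCII characters still take a single byte each, while other characters use two to four bytes.

```python
# UTF-8 byte lengths for a few characters: ASCII stays one byte,
# Latin-1-range characters take two, and emoji take four.
for ch in ["A", "£", "é", "😀"]:
    print(ch, len(ch.encode("utf-8")), "byte(s)")
```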