From The Free On-line Dictionary of Computing (05 January 2017) [foldoc]:
Unicode
1. A character set standard, originally 16 bits wide, designed
and maintained by the non-profit consortium Unicode Inc.
Originally Unicode was designed to be universal, unique, and
uniform, i.e., the code was to cover all major modern written
languages (universal), each character was to have exactly one
encoding (unique), and each character was to be represented by
a fixed width in bits (uniform).
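For illustration, a minimal Python 3 sketch (Python is not part of
the original entry and is assumed here only as a convenient host
language) showing that a character has exactly one code point,
while its width in bytes depends on the encoding chosen:

    # Sketch: one character, one code point (Python 3 assumed).
    ch = "\u017c"                  # LATIN SMALL LETTER Z WITH DOT ABOVE
    print(hex(ord(ch)))            # 0x17c -- the character's single code point
    print(ch.encode("utf-16-be"))  # b'\x01|'    -- two bytes, the original 16-bit width
    print(ch.encode("utf-8"))      # b'\xc5\xbc' -- variable width in UTF-8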
In parallel with the development of Unicode, an ISO/IEC
standard (ISO/IEC 10646) was being worked on that put a large
emphasis on compatibility with existing character codes such as
ASCII or ISO Latin 1. To avoid having two competing 16-bit
standards, the two teams agreed in 1992 to define a common
character code standard, known both as Unicode and as the BMP
(Basic Multilingual Plane) of ISO/IEC 10646.
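As a quick illustration of that compatibility (again a Python 3
sketch, assuming nothing beyond the standard library): the first
128 Unicode code points coincide with ASCII, and the first 256
with ISO Latin 1:

    # Sketch: Unicode code points agree with ASCII and ISO Latin 1 (Python 3 assumed).
    print(ord("A"))                    # 65      -- same value as in ASCII
    print("\u00e9".encode("latin-1"))  # b'\xe9' -- the single ISO Latin 1 byte...
    print(hex(ord("\u00e9")))          # 0xe9    -- ...equals the Unicode code point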
Since the merger, the character codes are the same, but the two
standards are not identical. The ISO/IEC standard covers only
the coding, while Unicode includes additional specifications
that help with implementation.
Unicode is not a glyph encoding. The same character can be
displayed as a variety of glyphs, depending not only on the
font and style, but also on the adjacent characters. A
sequence of characters can be displayed as a single glyph or a
character can be displayed as a sequence of glyphs. Which of
these applies is often font-dependent.
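A small Python 3 sketch of the character/glyph distinction,
using the standard unicodedata module (the example characters
are chosen here purely for illustration): the glyph "é" can be
one precomposed character or a sequence of two characters, and
normalization makes this explicit:

    # Sketch: one glyph, two possible character sequences (Python 3, standard library only).
    import unicodedata
    precomposed = "\u00e9"   # e with acute as a single character
    combined = "e\u0301"     # 'e' followed by COMBINING ACUTE ACCENT
    print(precomposed == combined)                               # False: different character sequences
    print(unicodedata.normalize("NFC", combined) == precomposed) # True: same text after normalization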
See also Jörgen Bettels and F. Avery Bishop's paper {Unicode:
A universal character code
(http://research.compaq.com/wrl/DECarchives/DTJ/DTJB02/DTJB02SC.TXT)}.
(2002-08-06)
2. A pre-Fortran language on the IBM 1130, similar to
MATH-MATIC.
[Sammet 1969, p.137].
(2004-09-14)