
6. Use of databases

6.1 Paper Products

Not long ago, the principal products derived from databases were printed. Bibliographic bulletins, under various names, contained all or part of the database. Their manufacture and distribution required no special advanced technology, and the subscription revenue, received in advance, represented a financial guarantee for the producers. These products, still widely distributed, are nevertheless being replaced little by little by other methods of accessing the stored information. Microforms, which a short-lived fashion suggested might replace paper products, have not gone beyond the status of storage media. If they do not disappear altogether, they will instead remain an adjunct to paper products.

6.2 On-line Access

On-line distribution of databases began at the end of the 1960s as a part of timesharing systems. The 1970s saw the beginning of their growth. The first search systems such as STAIRS, RECON, and ELHILL/ORBIT are still recognized names. The National Library of Medicine and NASA were pioneers in the field. The systems were at the time limited by poor telecommunications and by prohibitive storage costs.

Nevertheless, the market began to emerge, as did the idea of hosts, organizations that take responsibility for the distribution of databases produced by others. These hosts provide search systems, storage media, networks, and user training, in general in collaboration with the producers. The hosts can be classified either as supermarkets or as specialists. Some producers are at the same time hosts.

Although relations between database producers and hosts are generally good, there is a growing tendency towards competition, and sometimes even confrontation. A recent article by Harry F. Boyle of Chemical Abstracts Service, published in the ICSTI Proceedings, 1991, on the relations between hosts and guests describes in detail what currently happens.

It is worth noting that competition between producers is significant: more than 5,000 databases are available on 800 hosts. Users are becoming more demanding about the services they want to see. Some of them are beginning to feel that existing on-line systems, based on the Boolean model, are limited by nature. It is true that several other models have been proposed (vectorial, probabilistic, extended Boolean, fuzzy set) that aim to improve on the performance of the Boolean model, but none has yet developed into a large-scale commercial application.
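To make the contrast with the Boolean model concrete, the vector model mentioned above can be sketched in a few lines of present-day code. This is an illustrative sketch only; the documents and query are invented, and real systems add term weighting and much more.

```python
from collections import Counter
from math import sqrt

def cosine(a, b):
    # Cosine similarity between two term-frequency vectors:
    # unlike a Boolean AND/OR, it yields a graded score.
    num = sum(a[t] * b[t] for t in a if t in b)
    den = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def rank(query, docs):
    # Rank documents by similarity to the query instead of
    # returning only exact Boolean matches.
    q = Counter(query.lower().split())
    scored = [(cosine(q, Counter(d.lower().split())), d) for d in docs]
    return [d for s, d in sorted(scored, reverse=True) if s > 0]

docs = ["database indexing methods",
        "indexing of chemical abstracts",
        "weather report"]
print(rank("database indexing", docs))
```

A Boolean search for "database AND indexing" would return only the first document; the vector model also surfaces the partially matching second one, ranked lower.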

Josephine Maxon-Dadd of Dialog Information Services, in Trends in Database Design and Customer Services, published by NFAIS, has described the ideal database:

Great currency
Clean data
An easy link to full text
Controlled vocabulary (hierarchical) maintained and updated over the whole file
Uncontrolled vocabulary too, perhaps for trade names, proper names, or synonyms
Title, a reasonable number of authors, a good abstract
Bibliographic data fully identified and searchable
Complete coverage of every journal title included
No internal duplicates
Subject classification scheme (text and code searchable)
Cited references
Numeric indexing
User-friendly scientific notation
Multilingual indexing

I think that this eloquent list should make all database producers pause to think, above all when the same databases are more and more used to produce derivative products or are the subject of more and more sophisticated processing.

6.3 CD-ROM

The CD-ROM (compact disc read-only memory) is a database distribution medium introduced some years ago for the storage of text and graphics. It offers much the same advantages as the microfiche stores of texts of earlier decades. The disks are relatively easy to produce and to duplicate; they are also easy to ship from place to place, and can therefore be used for local storage of databases and for local retrieval activities. CD-ROMs also provide high-density storage for both text and graphics: a standard disk will store up to 600 million bytes of information.

When coupled with a personal computer, the potential of the medium is greatly enhanced. However, CD technology is somewhat hampered when the size of the database requires more than one disk. In that case, a complete search of the database currently requires the user to change disks or to use a jukebox.

6.4 Floppy Disks

The floppy disk is emerging as an information delivery medium for personal computers. Disks are easy to manufacture and can be produced in-house. They can be produced for a variety of operating systems.

6.5 New Methods of Access to Information

Most attempts to improve access to information contained in databases are aimed at moving from documentation to information. In fact, on-line information is little used by companies despite their needs. According to recent figures, databases provide only 7 per cent of the total information processed by companies.

According to Nicolas Grandjean of Synthélabo, what the user really wants is the answer. Real user-friendliness is the relevance of the reply to the question asked, not to the information request. Databases give raw information, where the answer is hidden in primary documents. In addition, the answer is often complex, in that it requires the correlation between several documents. Grandjean does not think that we can stay with these relatively unsophisticated information systems, above all with the volume of information available today. New techniques now under study are providing answers by conceiving a new dimension to information systems.

The sum of the documents contained in a database possesses properties independent of the documents taken separately. These properties can be exploited, both in themselves and to design tools for aiding indexing and searching.

7. Bibliometry applied to STI or scientometry

7.1 Definitions

Bibliometry was defined by A. Pritchard in 1969 as "the application of mathematics and statistical methods to books, articles, and other means of communication." "Scientometry" is specialized bibliometry applied to the STI field. A third term, "infometry," was adopted by the FID in 1987 to designate the group of metric activities relevant to information (thus covering both bibliometry and scientometry).

7.2 The Functions of Scientometry

The functions of scientometry are the analysis, evaluation, and graphic representation of STI by means of statistical methods, mathematics, and data analysis. The analysis aims to answer the question "Who is doing what, and where?" on the basis of STI. Two types of evaluation can be carried out on STI: "metric evaluation" of information flow (articles and journals) and "quality evaluation" of information processed in databases. Graphic representation aims to present STI in the form of maps containing both the research fields and the participants (researchers, institutions, countries), so as to provide a representation of the structure of information at any given moment of its development.

7.3 Documentation and information

We know that the amount of stored documentation is growing exponentially, but its informational content is only growing linearly. Remember that documents and information are not the same thing. By asking the question of the analysis of information and its representation, what we are trying to do is to map the knowledge structures of this information, and not simply to count the documents. It is necessary, however, to present the information in the framework of a relevant cognitive structure. This is why it is important to be able to represent, with the aid of bibliometric techniques, such a framework based on the knowledge contained in the scientific literature.

7.4 Analysis Techniques

The technique used for this purpose is the formation of keyword clusters. This structures the information and permits its processing in hypertext form: because each cluster (a group of interlinked keywords) indexes a certain number of bibliographic references, it is a means of organizing the information thematically, and thus represents a knowledge structure. Instead of looking through a body of information in sequential order (a simple list of references, a series of bibliographic citations), we have here a method of following a thematic order that can be constructed from the bibliographic data themselves.
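The cluster-formation method described above can be illustrated by a small sketch. The records, keywords, and the threshold of two co-occurrences are invented for the example: keyword pairs that co-occur often enough are linked, and the connected groups form the thematic clusters.

```python
from collections import defaultdict
from itertools import combinations

def coword_clusters(records, threshold=2):
    # Count how often each pair of keywords is assigned to the same record.
    pairs = defaultdict(int)
    for keywords in records:
        for a, b in combinations(sorted(set(keywords)), 2):
            pairs[(a, b)] += 1
    # Link keywords whose co-occurrence reaches the threshold; the
    # connected components are the thematic clusters.
    links = defaultdict(set)
    for (a, b), n in pairs.items():
        if n >= threshold:
            links[a].add(b)
            links[b].add(a)
    seen, clusters = set(), []
    for k in links:
        if k in seen:
            continue
        stack, comp = [k], set()
        while stack:
            t = stack.pop()
            if t not in comp:
                comp.add(t)
                stack.extend(links[t] - comp)
        seen |= comp
        clusters.append(sorted(comp))
    return clusters

records = [["polymer", "catalysis"], ["polymer", "catalysis", "kinetics"],
           ["laser", "optics"], ["laser", "optics"]]
print(coword_clusters(records))
```

Each resulting cluster can then serve as a thematic entry point to the references it indexes, which is the hypertext-like organization the text describes.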

The advantage of the use of bibliometric techniques is that it does not involve classification codes that have previously been assigned and fixed. The development of research and its organization can be followed as they are presented in the scientific literature.

7.5 The Contribution of Linguistics

Many researchers are working on the construction of a new generation of knowledge bases, with the objective of using them in documentary computing. Others are applying themselves to improving access to full-text databases in order to allow multilingual searching, or to showing that it is possible to apply artificial intelligence techniques to database searching.

8. Hypertext

In 1967, Ted Nelson proposed the term "hypertext" for non-sequential writing that would be inconvenient to produce or represent on paper. The structure of a hypertext database does seem to provide the flexibility required to create a database that would enable researchers to search for information and to follow associations in ways that reflect their normal information-seeking activities. However, the implementation of such a database would also need adequate guidance facilities and more powerful text retrieval capacities.

In the context of information retrieval, awareness of the potential of hypertext needs to be accompanied by recognition of the problems involved in developing effective hypertext-based solutions to retrieval problems. For example, standard indexing techniques employing either controlled or uncontrolled vocabularies can also be employed to help the user navigate the database or identify appropriate areas for searching. Unfortunately, the same problems are encountered with the use of controlled and uncontrolled vocabularies for indexing and retrieval in hypertext as in any other application. Controlled vocabularies require work in their preparation and updating, and users need to index their entries manually, while uncontrolled vocabularies suffer from proliferation of index entry terms and from problems with synonyms and homographs.
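The role of a controlled vocabulary in taming the synonym problem mentioned above can be sketched as follows. The miniature thesaurus, its terms, and the document assignments are all hypothetical, invented for the example.

```python
# Hypothetical controlled vocabulary: each preferred term lists the
# uncontrolled variants (synonyms) that should map onto it.
THESAURUS = {
    "automobile": {"car", "auto", "motorcar"},
    "database": {"data base", "databank", "data bank"},
}

def normalize(term):
    # Map a free-text index entry onto its preferred term, if any.
    t = term.lower()
    for preferred, variants in THESAURUS.items():
        if t == preferred or t in variants:
            return preferred
    return t  # an uncontrolled term passes through unchanged

# Build an index: synonymous entries collapse onto one access point.
index = {}
for doc_id, term in [(1, "data bank"), (2, "Databank"), (3, "car")]:
    index.setdefault(normalize(term), set()).add(doc_id)
print(sorted(index.items()))
```

The price of this collapsing is exactly the one the text notes: someone must prepare and update the `THESAURUS` table by hand, whereas dropping it brings back the proliferation of variant entry terms.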

9. Multimedia

Databases nowadays cannot be discussed without mentioning multimedia. This concept is so wide that it is nearly impossible to fix its limits. Multimedia systems, which should in due course allow combined manipulation, from a single workstation, of text, sound, and images, should bring together a number of different technologies. The market for this type of product is still unknown. Documentary searching, which in 1989 was almost the only sector concerned with these technologies, should reach a global turnover of US$4 billion by 1994. Nevertheless, it seems that the tools currently available are not adequate for large applications. We know how to navigate in small graphs held in memory but not in multimedia databases of several gigabytes.

10. Economic problems

10.1 The Position in the Economy

Activities related to database production and distribution were for a long time considered scientific activities managed by an exchange system, but the current situation is very different. The explosion of information technologies and their rapid spread into every aspect of economic life have put back onto the agenda a question already raised in the 1960s. The emergence of new skills, creating many jobs in the database field, has kindled research and discussion. An American researcher, M.V. Porat, carried out a significant statistical study in 1977 in which he estimated that by 1967, 46 per cent of the American GNP was already related to information activities. Using Porat's method, J. Voge estimated that in 1984 information workers represented between 40 and 47 per cent of the workforce of the major industrialized countries. It is true that information production includes not only activities of design and transmission but also of identification and integration by the user; it is at the same time process and product. Database production and distribution are integral parts of these analyses. Numerous studies have been carried out on the value of information and its special features since it came to be considered an economic good.

The difficulty of agreeing on some basic concepts and the amount of work carried out on controlling the use and duplication of information demonstrate very well its special features. The flow of international exchanges in the DB field is also the subject of major economic studies carried out by the IMO and certain countries. It is also interesting that in the United States mainly American DBs are used, with only 10 per cent foreign; in Japan, however, 75 per cent of the databases used are foreign; in Europe the proportion is 18 per cent.

10.2 The Costs

Regardless of the discussions and studies on the place of databases and data banks in the economy, their production and distribution represent a significant cost. DB production costs fall into four categories: direct production costs, manufacturing costs, indirect costs, and administrative costs.

Production costs include very large staff costs for the library-related and conceptual processing of documents. Analysis and indexing are operations requiring scarce, highly qualified staff that generate high costs. It is in these areas that producers are trying to achieve savings by using author abstracts and by encouraging studies on assisted or automatic indexing techniques.

A significant element, possibly the largest, of production costs is allocated to the purchase of sources. Some producers try to minimize these costs by agreements with publishers or libraries in order to acquire the sources free or in exchange for their own products. The data-processing costs involved in database production vary according to the techniques used and the size of the base. They also vary with the complexity of the system in the case of various types of cooperation on database creation, which require numerous interfaces.

Manufacturing costs essentially include magnetic tape production and the operations required to put the tapes into the different formats of the hosts, as well as the production of derivative products such as publications containing all or part of the DB, CD-ROMs, and diskettes.

Indirect costs include the development costs of new methods and products, promotional costs, marketing, user training and assistance, staff training, and user documentation.

Overhead, as for other activities, varies according to the facilities offered.

10.3 Pricing

Although most of the participants in the information world agree in considering information as a resource, a product, pricing problems are still treated outside real economic considerations. One of the proofs of this is that most database producers, except for business databases, are not-for-profit organizations, and many of these databases are government-subsidized. Pricing problems are also treated differently depending on the products produced from the databases. In fact, what tends to be called the "market price" still plays a role in determining prices.

On-line pricing is most often determined by the hosts, whose own pricing policies are also developing. For some years the division of revenue between database producers and hosts has been the subject of discussion, sometimes acrimonious, and has led to confrontation between the major participants of the two professions.

It is not the object of this paper to enter into the details of pricing, which are extremely complex. It can simply be noted that one of the critical points for database producers is the establishment of a sales and pricing strategy. Whether the strategies are defined by market sector or by product group, or whether price reductions are foreseen for particular users or to create the market, database pricing needs a clear definition of the target revenues by the producer. This requires, evidently, an exact knowledge of the cost elements for each product, and a full knowledge of the market, the competition, and the cross-relations between different subproducts of the same database.

11. Ownership, legislation, and copyright problems

Because countries, and even producers, were late to recognize the economic value of databases, databases are not at present clearly protected by legislation. The legislation that does exist is varied and in some instances can even impair the establishment and operation of a real market.

Moreover, new technologies and the increasingly widespread availability of localized equipment allow pirating, which, even if not organized in most cases, is very dangerous for database producers. Some could even disappear for this reason.

On 29 January 1992 the Commission of the European Community presented a draft directive relating to the legal protection of databases; once adopted by the Council, the directive is to come into force on 1 January 1993. By databases, the Commission understands a collection of works or deposited material, stored and accessible by electronic means; i.e. on-line data banks are concerned, but more particularly databases on any medium other than paper, e.g. CD-ROM, videodisc, CD-I, etc. This directive is thus essential, as much for producers and publishers of traditional DBs as for those in the multimedia and electronic publishing markets. Coming after the Green Paper on copyright and the challenge of technology, published in 1988, this document expresses the European concern to deal with the problems of intellectual property.

In spite of these efforts, the problems remain complex at all levels. Relations with scientific journal publishers must be settled in the case of full-text databases and the digitization of author abstracts used without added value. Relations with hosts are of a different type but require contracts and licences appropriate for the special situation of DB distribution. These points are examined in detail in a book published by the NFAIS, Guide to Database Distribution. Here again, technology creates problems that are difficult to resolve.

The opening of gateways between hosts creates complications that few contracts have so far taken into account. This must be done.

12. Conclusion

In this paper I have tried to show the role and importance of databases as a component of the economy and as a tool for research and industry. However, this field is in continual technological evolution and, given its spectacular growth, cannot live in isolation. First, thought must be given to how to organize production and development so as to stay abreast of the technologies. This means continual monitoring of the quality and security of the data produced. DB operation must then be organized in an environment where the actors frequently change their roles. Database production depends heavily on the availability of sources, which means managing the relationship with publishers. DB distribution depends on hosts and users. Relations with hosts must be closely monitored, although they are generally without particular problems. The user is the person who justifies the existence of the database. Users are also evolving, from experienced professionals to end users who have neither the time nor the inclination to experiment with systems that are too sophisticated. It therefore seems essential that in future attention be directed towards them.


AFNOR standard 247-102, August 1978. General Principles for Indexing Documents.

AFNOR standard 274-100, December 1981. Rules for Creating Monolingual Thesauri.

Allen, B. (1982). "Recall cues in known-item retrieval." Journal of the American Society for Information Science 40 (4): 246-252.

Allen, K.J. (1988). "Online Information and the Needs of Intermediaries." In: Proceedings of the 1988 International Online Information Meeting. Learned Information Europe 1: 161-169.

Barbarino, M. (1989). "Similarity Detection in Online Bibliographic Databases." In: Proceedings of the 1989 International Online Information Meeting. Learned Information Europe, 111-117.

Berry, J.N. (1991). "The Politics of and Expectations for the White House Conference on Library and Information Services: Lurching Toward . . . Washington." Library Journal 1991 (15): 32-35.

Chaumier, J., and M. Dejean (1992). "Computer-assisted Indexing: Principles and Methods." Documentaliste 29 (1): 3-6.

Chaumier, J., and M. Dejean (1990). "L'indexation documentaire: de l'analyse conceptuelle humaine a l'analyse automatique morpho-syntaxique." Documentaliste 27 (6) 275-279.

Cotter, G.A. (1988). "Global Scientific and Technical Information Network." In: Proceedings of the 1988 International Online Information Meeting. Learned Information Europe 2: 611-618.

Czarnota, B. (1991). "The New Europe." AGARD Lecture Series 181, 9 p.

"Database Protection Directive" (1992). Infotecture Europe 203.

Detemple, W. (1988). "Future Enhancements for Full-Text-Graphics, Expert Systems and Frontend Software." In: Proceedings of the 1988 International Online Information Meeting. Learned Information Europe 1: 271-278.

Dreidemy, P. (1991). "Alerte chez les serveurs." Videotex et RNIS Magazine (62): 25-35.

Dreidemy, P. (1990). "L'évolution du metier de serveur: L'heure des choix marketing et technologiques." Videotex et RNIS Magazine (50): 41-45.

Efthimiadis, E.N. (1990). "Progress in Documentation. Online Searching Aids: A Review of Front Ends, Gateways and Other Interfaces." Journal of Documentation 46 (3): 218-262.

Elias, A.W. (1989). "Copyright, Licensing Agreements and Gateways." Information Services and Use (9): 347-361.

Fayen, E.G. (1989). "Loading Local Machine-readable Data Files: Issues, Problems, and Answers." Inf. Technol. Libr. 8 (2): 132-137.

Holtham, C. (1989). "Information Technology Management into the 1990's: A Position Paper." J.l. T. 4 (4), 17 p.

ICSTI Symposium Proceedings: Information, la quadrature du cercle. May 1991, Nancy, France.

"Information Transfer" (1982). In: ISO Standards Handbook 1, 2nd. edition, 521 p.

Kennedy, H.E. (1992). "Global Information Trends in the Year 2000." In: 1992 STICA Annual Conference, March 1992.

Laubier C. (de), and J. Scolary (1991). "Enquête: les bibliothèques face au NTI." NTI (29): 21-23.

"La logique d'interrogation des banques de données remise en question" (1989). Bases (7): 6-7.

Luctkens, E. (1991). "SIGIR '90: le point de vue de l'utilisateur de systèmes documentaires." Cahier de la documentation (1), 11 p.

Lyon, E. (1991). "Spoilt for Choice? Optical Discs and Online Databases in the Next Decade." Aslib 25 (1): 37-50.

Marchetti, P.G., and G. Muehlhauser (1988). "User Behaviour in Simultaneous Multiple File Searching." In: Proceedings of the 1988 International Online Information Meeting. Learned Information Europe 1: 41-49.

McCallum, S.H. "Standards and Linked Online Information Systems." LRTS 34 (3): 360-366.

McLelland, J. (1990). "Computers, Databases and Thesauri." Aslib Proceedings 42 (7-8): 201-205.

Melody, W.H. (1986). "The Context of Change in the Information Professions." Aslib Proceedings 38 (8), 8 p.

Montviloff, V., and W. Löhner (1989). "The General Information Programme: The Year, Achievements and Future Prospects." Int. Forum Inf. and Docum 14 (4): 3-9.

"Multimedia: la Babel technologique." (1991). 01 Informatique (1162): 44-54.

O'Neill, E.T. (1988). "Quality Control in Online Databases." In: Annual Review of Information Science and Technology 23. White Plains, New York: Knowledge Industries, pp. 125-126.

"Protection des données: la proposition de directive du Conseil Européen" (1990). Infotecture (213).

"Rapport de l'IMO: 1048 banques de données en Europe et 2214 aux USA" (1991). Infotecture (218).

Rasmussen, A.M., and B.A. Hubbard (1988). "Business Information from Databases: A European Perspective." In: Proceedings of the 1988 International Online Information Meeting. Learned Information Europe 1: 149-159.

Rivier, A. (1990). "Construction des langages d'indexation: aspects théoriques." Documentaliste 27 (6): 263-274.

Rosen, L. (1990). "CD-Networks and CD-ROM: Distributing Data on Disk." Online (July): 102-105.

Ryan, H.W. (1991). "Open Systems: A Perspective on Change." Journal of Information Systems Management 1991: 62-66.

Salton, G. (1988). "Thoughts about Modern Retrieval Technologies." Information Services and Use 8: 107-113.

Schipper, W. (1990). "Through the NFAIS Looking Glass: The Year 2000." Int. Forum Inf. and Docum. 15 (3): 29-31.

Schipper, W., and B. Unruh (1990). Trends in Database Design and Customer Services. NFAIS report series.

Schmitt, R. (1991). "L'information, un prodigieux mineral a extraire." Le Monde Informatique 1991 (11): 35-39.

Schwuchow, W. (1991). "The Situation of the Online Information Services Industry in the European Community (with Special Consideration of the FRG)." Int. Forum Inf. and Docum. 16 (1): 6-10.

Scott, P. (1990). "The Organisational Impact of the New Media." Aslib Proceedings 42 (9): 219-225.

Seigle, D.C. (1989). "System Integration in an Image Intensive Environment." In: Proceedings of SICED '89. Paris: Informatics.

Simmons, P. (1990). "Serial Records, International Exchanges and the Common Communication Format." IFLA Journal 16 (2): 198-203.

"So Many Databases, So Little Time." (1991). Infotecture Europe 1991 (195).

Stamper, R., K. Liv, M. Kolman, P. Klarenberg, F. Van Slooten, Y. Ades, and C. Van Slooten (1991). "From Database to Normabase." International Journal of Information Management 1991 (11): 67-84.

Stern, B.T. (1991). "Adonis Revisited." NFAIS Newsletter 33 (11): 140.

Straub, D.W., and J.C. Wetherbe (1989). "Information Technologies for the 1990's: An Organizational Impact Perspective." Communications of the ACM 32 (11): 1328-1339.

Strozik, T. (1990). "Managing Technology for Today's Library Service." The Bookmark 48 (3): 188-193.

"L'Utilisation des CD-ROM dans les bibliothèques en Europe: où en est-on?" Bases 1989 (45).

Woods, L.B., T. Willis, D. Chandler, B. Manois, and P. Wolfe (1991). "International Relations and the Spread of Information Worldwide." Int. Libr. Rev. 1991 (23): 91-101.

Communication networks

1. Introduction
2. The narrow-band ISDN
3. Broad-band ISDN
4. Concluding remarks

Takahiko Kamae


A fully digitized communication network, the Integrated Services Digital Network (ISDN), has been expanding and promoting multimedia communication. Video telephone and video conferencing are expected to grow rapidly.

B-ISDN will be an infrastructure for the twenty-first century. New information-offering services may take advantage of B-ISDN.

1. Introduction

ISDN is a fully digitized communication network that is expected gradually to take over the telephone network. ISDN was standardized in detail by the International Telegraph and Telephone Consultative Committee (CCITT), and thus the world-wide connection of ISDN should be easy.

In Japan, ISDN was made available commercially by Nippon Telegraph and Telephone Corporation (NTT) in 1988. Since then, ISDN has been growing and is now available almost all over the country. Multimedia computing is a popular topic among people involved with personal computers and workstations. Multimedia computers are believed to take full advantage of ISDN. In other words, ISDN will promote multimedia communication.

ISDN is based on synchronous transfer mode (STM) technology. To improve multimedia communication features, novel technology, called the asynchronous transfer mode (ATM), is being developed in many countries. CCITT has been standardizing the broad-band ISDN (B-ISDN) on the basis of ATM. One of the most important parts of B-ISDN is the "fibre-to-the-home" (FTTH) concept.

In FTTH, optical fibre cables are extended to customer premises; specifically, optical fibres will replace the metallic twisted pairs now used for subscriber loops. B-ISDN is expected to push telecommunications strongly toward multimedia services, covering up to high-definition TV (HDTV).

This paper describes the state of the art in ISDN and future trends relevant to B-ISDN.

2. The narrow-band ISDN

2.1 User-Network Interface

The present ISDN is generally referred to as "narrow-band ISDN" (N-ISDN) to distinguish it from B-ISDN. N-ISDN has two kinds of user-network interface (UNI): the basic-rate interface (BRI) and the primary-rate interface (PRI). BRI has two 64Kb/s channels, called B, and one D channel, and is thus frequently called the 2B+D interface. PRI has a bitrate of 1.536Mb/s, which can be divided into B (64Kb/s), H0 (384Kb/s), and D (64Kb/s) channels. The typical division is 23B+D, and the interface is thus frequently called the 23B+D interface. However, when a high bitrate is necessary, the whole 1.536Mb/s can be used as one channel, called the H11 channel.

The D channel is mainly used for control between user terminals and the network. All information in the D channel is "packetized." A customer can also use the D channel as a packet communication channel; in this case, control information and customer information are packet-multiplexed in the D channel. The packet-switching service is also provided through the B channel; in this case, the whole bitrate (64Kb/s) of the B channel can be assigned to customer packets.

In total, ISDN offers switched 64Kb/s service (including telephone), switched 384Kb/s service, switched 1.536Mb/s service, and switched packet service through both B and D channels. Furthermore, ISDN has various features suited to multimedia communication.
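The interface bitrates quoted above can be checked by simple arithmetic. This is an illustrative sketch; the 16Kb/s rate of the BRI D channel is the standard CCITT value, assumed here because the text does not state it.

```python
# Channel bitrates in Kb/s, as listed in the text.
B, H0, H11 = 64, 384, 1536

# Basic-rate interface: 2B+D, with a 16 Kb/s D channel
# (standard CCITT value, assumed here).
bri = 2 * B + 16
print(bri)                            # 144

# Primary-rate interface: 23B+D, with a 64 Kb/s D channel.
pri = 23 * B + 64
print(pri)                            # 1536, i.e. 1.536 Mb/s

# H0 and H11 are aggregates of B channels: 6B and 24B.
print(H0 == 6 * B, H11 == 24 * B)    # True True
```

The same arithmetic explains why the whole 1.536Mb/s PRI payload can alternatively be used as one H11 channel: 24 x 64Kb/s exactly fills it.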

2.2 Multimedia Communication Features

One of the biggest differences between ISDN and the telephone network is that ISDN is receptive to various communication media, while the telephone network is strongly biased to the telephone service. ISDN is equipped with universal communication protocols according to the standard OSI reference model. The layer 1 interface of ISDN consists of BRI and PRI. The layer 2 and layer 3 protocols go through D channel. The layer 2 protocol mainly establishes the data link between a terminal and the network. The layer 3 protocol is very important to multimedia communication.

Among the various commands in the layer 3 protocol, the bearer-capability, low-layer-compatibility, and high-layer-compatibility commands are closely related to multimedia communication. With the bearer-capability command, a terminal requests from the network the kind of channel it needs: the bitrate (B, H0, or H11) and, in the case of B, whether the content is speech, audio, or unrestricted 64Kb/s data. Low-layer compatibility is used to select the end-to-end transfer capability. High-layer compatibility is used for matching the terminals at both ends; that is, it specifies the kind of terminal: telephone, facsimile, telex, teletex, MHS, etc.

In many cases various kinds of terminals, e.g. telephone, facsimile, and videotex, are connected to a BRI. When a call is originated by a Group 4 facsimile, the transfer bitrate is specified as unrestricted 64Kb/s and the kind of terminal as facsimile, using high-layer compatibility. At the receiving side, terminals other than facsimiles do not respond, and thus only a facsimile answers.

Thus, various kinds of terminals connected to the same BRI can communicate independently with the same kind of terminals at the receiving end. This feature is very valuable to multimedia communication.

2.3 ISDN Application to Video Telephone and Video Conferencing

Figure 1 shows the standard model of a video telephone/video conferencing system. For standardization of the video codec, a common interface format (CIF) for TV signals was defined to enable the interconnection of the 525/60 (North American, Japanese) and 625/50 (European) TV systems, as shown in figure 2. The standard codec encodes TV signals in CIF. The conversion between a national TV standard and the CIF can be done freely in each country. For low-cost video codecs, a quarter CIF (QCIF) was also defined.

Figure 1 Standard video telephone/video conferencing system

Figure 2 International connection through the Common Interface Format (CIF)

Table 1 shows typical uses of ISDN channels for video telephone/video conferencing services. B or 2B is suited to video telephone, and H0 or H11 to video conferencing. The combination of QCIF, 48Kb/s video, and 16Kb/s audio may make it possible to offer a low-cost video telephone service.

Figure 3 shows a one-board video codec developed by NTT. This codec can be used with B and 2B in table 1. Its functions cover NTSC/CIF conversion, video codec, and transmission codec. A 16Kb/s audio codec can be mounted as a child board on this board.

Figure 4 shows a codec for 384Kb/s video. Four newly developed DSPs are used in the 64K/128Kb/s codec and eight in the 384Kb/s codec.

2.4 Colour Picture Transmission and Colour Facsimile

The standard colour picture codec, based on the JPEG standard, was developed as shown in figure 5. This codec has an interface to the VME bus and can encode a picture having maximally 8192 pixels in one direction. By assigning bitrates suitable to brightness and colour difference signals, the data compression rate can come down to 1/20 without noticeable degradation. An optical disk memory and the JPEG codec board are attached to a UNIX workstation to constitute a colour picture filing system, as shown in figure 6. A colour copier attached to the workstation can be used as an I/O terminal. Colour pictures are transmitted through ISDN. A JPEG codec is useful to save transmission time and storage capacity.
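The practical effect of the 1/20 compression rate can be illustrated with a rough calculation. The picture size and the resulting times are hypothetical, chosen only for illustration; the paper gives neither.

```python
# Time to send one 1024 x 1024, 24-bit colour picture over a 64 Kb/s
# B channel, uncompressed versus with 1/20 JPEG compression.
raw_bits = 1024 * 1024 * 24
compressed_bits = raw_bits / 20            # 1/20 compression, as quoted
b_channel = 64_000                         # bits per second

print(round(raw_bits / b_channel))         # seconds, uncompressed
print(round(compressed_bits / b_channel))  # seconds, compressed
```

For this hypothetical picture the transmission time drops from several minutes to a fraction of a minute, which is why the codec "is useful to save transmission time and storage capacity."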

Table 1 Bitrates for video telephone/video conferencing

Channel  Division                   Coding
H11      B(audio) + 23B(video)      audio: SB-ADPCM, PCM
H0       B(audio) + 5B(video)       video: hybrid coding
2B       B(audio) + B(video)
         16k(audio) + 112k(video)   audio: LD-CELP
B        16k(audio) + 48k(video)    video: hybrid coding
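The divisions in table 1 can be checked against the channel capacities. This sketch assumes the 2B division pairs 16Kb/s audio with 112Kb/s video, so that the total fills the 128Kb/s channel; the other rows are as quoted.

```python
# Consistency check for the channel divisions of table 1 (Kb/s):
# each audio + video division should fill its channel's capacity.
B = 64
capacity = {"B": B, "2B": 2 * B, "H0": 6 * B, "H11": 24 * B}

divisions = {
    "B":   16 + 48,       # 16k audio + 48k QCIF video
    "2B":  16 + 112,      # 16k audio + 112k video (assumed pairing)
    "H0":  B + 5 * B,     # B audio + 5B video
    "H11": B + 23 * B,    # B audio + 23B video
}
for name, bits in divisions.items():
    print(name, bits == capacity[name])
```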

Figure 3 64K/128Kb/s video codec (28 cm x 28 cm)

The JPEG standard will stimulate the standardization of colour facsimile. Important points of colour facsimile may be:

- Interworking with existing facsimiles, particularly with Group 4 facsimile;
- the same resolution as Group 4 facsimile in the black and white portion;
- application of JPEG for the full colour portion;
- distinction of the black and white and the full colour portions in the scanning process; and
- application of the standard open document architecture to distinguish the black and white portion from the full colour portion.

These should be guidelines for standardizing colour facsimile.

Figure 4 384Kb/s video codec

Figure 5 JPEG codec

Figure 6 Still colour picture filing and transmission
