The most basic tele-service supported by GSM is telephony. As with all other communications, speech is digitally encoded and transmitted through the GSM network as a digital stream. There is also an emergency service, where the nearest emergency-service provider is notified by dialling three digits (similar to 911).
A variety of data services is offered. GSM users can send and receive data, at rates up to 9600 bps, to users on POTS (Plain Old Telephone Service), ISDN, Packet Switched Public Data Networks, and Circuit Switched Public Data Networks using a variety of access methods and protocols. Since GSM is a digital network, a modem is not required between the user and GSM network, although an audio modem is required inside the GSM network to interwork with POTS.
A unique feature of GSM, not found in older analog systems, is the Short Message Service (SMS). SMS is a bidirectional service for short alphanumeric messages (up to 160 characters, carried in 140 payload bytes). Messages are transported in a store-and-forward fashion. For point-to-point SMS, a message can be sent to another subscriber to the service, and an acknowledgement of receipt is provided to the sender. SMS can also be used in a cell-broadcast mode, for sending messages such as traffic updates or news updates.
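The 160-character limit follows from packing 7-bit characters into 140 payload bytes. The sketch below (Python; it assumes plain ASCII codes rather than the real GSM 03.38 alphabet, which has its own character table and escape sequences) shows the septet-packing idea:

```python
def pack7(text: str) -> bytes:
    """Pack text into 7-bit septets, little-endian (160 chars -> 140 bytes).
    Simplifying assumption: characters are plain ASCII, not GSM 03.38."""
    bits, nbits, out = 0, 0, bytearray()
    for ch in text:
        bits |= (ord(ch) & 0x7F) << nbits   # append 7 bits above what we hold
        nbits += 7
        while nbits >= 8:                   # emit completed bytes
            out.append(bits & 0xFF)
            bits >>= 8
            nbits -= 8
    if nbits:                               # flush any leftover bits
        out.append(bits & 0xFF)
    return bytes(out)

def unpack7(data: bytes, count: int) -> str:
    """Inverse of pack7: recover `count` 7-bit characters from packed bytes."""
    bits, nbits, chars = 0, 0, []
    for b in data:
        bits |= b << nbits
        nbits += 8
        while nbits >= 7 and len(chars) < count:
            chars.append(chr(bits & 0x7F))
            bits >>= 7
            nbits -= 7
    return "".join(chars)
```

Packing 160 characters yields exactly 140 bytes (160 × 7 = 1120 bits = 140 × 8), which is where the message-length limit comes from.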


GSM, together with other technologies, is part of an evolution of wireless mobile telecommunications that includes High-Speed Circuit-Switched Data (HSCSD), General Packet Radio Service (GPRS), Enhanced Data rates for GSM Evolution (EDGE), and the Universal Mobile Telecommunications System (UMTS).
GSM Network Operators
T-Mobile and Cingular operate GSM networks in the United States on the 1,900 MHz band. GSM networks in other countries operate at 900, 1,800, or 1,900 MHz.
GSM Security: GSM security issues such as theft of service, privacy, and legal interception continue to raise significant interest in the GSM community (www.gsm-security.net).
In cellular service there are two main competing network technologies: Global System for Mobile Communications (GSM) and Code Division Multiple Access (CDMA).
Coverage: The most important factor is getting service in the areas you will be using your phone. Upon viewing competitors' coverage maps you may discover that only GSM or CDMA carriers offer cellular service in your area.
Data Transfer Speed: With the advent of cellular phones doing double and triple duty as streaming video devices, podcast receivers, and email devices, speed is important to those who use the phone for more than making calls. CDMA has traditionally been faster than GSM, though both technologies continue to rapidly leapfrog along this path. Both boast "3G" standards, or 3rd-generation technologies.
Subscriber Identity Module (SIM) cards: In the United States only GSM phones use SIM cards. The removable SIM card allows phones to be instantly activated, interchanged, swapped out and upgraded, all without carrier intervention. The SIM itself is tied to the network, rather than the actual phone. Phones that are card-enabled can be used with any GSM carrier.
Roaming: For the most part, both networks have fairly concentrated coverage in major cities and along major highways. GSM carriers, however, have roaming contracts with other GSM carriers, generally allowing wider coverage of more rural areas, often without roaming charges to the customer. CDMA networks may not cover rural areas as well as GSM carriers, and though they may contract with GSM cells for roaming in more rural areas, the charge to the customer will generally be significantly higher.


GSM (Global System for Mobile communication) is a digital mobile telephone system that is widely used in Europe and other parts of the world. GSM uses a variation of Time Division Multiple Access (TDMA) and is the most widely used of the three digital wireless telephone technologies (TDMA, GSM, and CDMA). GSM digitizes and compresses data, then sends it down a channel with seven other streams of user data, each in its own time slot. It operates in either the 900 MHz or 1,800 MHz frequency band.
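The time-slot arithmetic can be sketched with the standard GSM figures: each carrier is divided into a TDMA frame of 8 bursts, with a frame length of 120/26 ≈ 4.615 ms (values taken from the GSM air-interface specification; the helper below is illustrative, not part of any real stack):

```python
FRAME_MS = 120 / 26                    # GSM TDMA frame length, ~4.615 ms
SLOTS_PER_FRAME = 8                    # one full-rate user per slot
SLOT_MS = FRAME_MS / SLOTS_PER_FRAME   # one burst, ~0.577 ms

def active_slot(t_ms: float) -> int:
    """Which of the 8 time slots is on the air at time t (in ms).

    Position within the repeating frame, divided by the burst length.
    """
    return int((t_ms % FRAME_MS) / SLOT_MS)
```

Each user transmits only during its own slot, so a full-rate voice channel occupies one-eighth of the carrier's airtime.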


During the early 1980s, analog cellular telephone systems were experiencing rapid growth in Europe, particularly in Scandinavia and the United Kingdom, but also in France and Germany. Each country developed its own system, which was incompatible with everyone else's in equipment and operation. This was an undesirable situation, because not only was the mobile equipment limited to operation within national boundaries, which in a unified Europe were increasingly unimportant, but there was also a very limited market for each type of equipment, so economies of scale and the subsequent savings could not be realized.
The Europeans realized this early on, and in 1982 the Conference of European Posts and Telegraphs (CEPT) formed a study group called the Groupe Spécial Mobile (GSM) to study and develop a pan-European public land mobile system.
In 1989, GSM responsibility was transferred to the European Telecommunications Standards Institute (ETSI), and phase I of the GSM specifications was published in 1990. Commercial service was started in mid-1991, and by 1993 there were 36 GSM networks in 22 countries. Although standardized in Europe, GSM is not only a European standard. Over 200 GSM networks (including DCS1800 and PCS1900) are operational in 110 countries around the world. In the beginning of 1994, there were 1.3 million subscribers worldwide, which had grown to more than 55 million by October 1997. With North America making a delayed entry into the GSM field with a derivative of GSM called PCS1900, GSM systems exist on every continent, and the acronym GSM now aptly stands for Global System for Mobile communications.


Modern electronic and telecommunications equipment allows us to go one step further in the transmission and recording of ECG signals. New, miniature ECG machines with many capabilities and advantages are being offered to physicians and patients. They are much smaller and lighter than the standard ECG machines used in telemetry. It is not necessary to connect cumbersome ECG cables or to apply ECG electrodes, because the new recorders have sealed metal contacts. Interestingly, such recorders are built in the form of common everyday objects, such as a wallet for credit cards and bills.
Training the patient is very simple. By simply placing the contacts against the chest, the user can record a real-time ECG signal. Users can carry such ECG devices not only at home but also while working, traveling, or driving a car. Thus, no matter where users are located, when they feel heart irregularities they can record and transmit the ECG signal by pressing a single button. On some models the recorded data can be viewed in full on a small built-in LCD monitor.

Twenty-four hours a day, all year round, the centre receives ECGs from many patients in real time, and the patients have direct access to a cardiologist. Detailed real-time ECG waveforms are displayed on the centre's monitors. Trained medical experts retrieve the patient's medical history, take down his symptoms over the phone (standard or mobile), compare the transmitted ECG with a previously recorded one, and interpret it. A quick initial analysis can be undertaken, and instructions for treatment and clinical advice can be given. If necessary, the position of the patient can be determined using GPS and the emergency medical service can be contacted.


The methodology of this state-of-the-art design is to attach the leads of a portable electronic circuit to the subject's body while the other end of the circuit is interfaced with a cellular phone. After the electrocardiographic bio-signals are acquired from the body, the signal is passed through higher-order low-pass, band-pass, high-pass, and band-reject filters to remove noise and other radio-wave interference. The signal is then digitized, converted to an appropriate packet form, and uploaded to the cellular network, via the cell phone, as an SMS text message. With practice, interpreting the ECG is a matter of pattern recognition, and if physicians or heart specialists are provided with the ECG in time, casualties can be averted and lives saved. Specialists and physicians can infer cardiac disorders when provided with a recorded ECG waveform of about a minute's span along with brief generic detail about the patient. Thus the transmitted ECG, with brief generic information about the subject embedded, can be received on either the physician's cell phone or a receiving computer.


With the development of electronics and its application in medicine, it is possible to transmit and process many vital parameters of the human body. The most important, and at this moment the most interesting, signal to monitor and analyze is the electrocardiography (ECG) signal. For a patient suffering from cardiac disease it is very important to perform an accurate and quick diagnosis. For this purpose, continuous monitoring of the ECG signal and of the patient's current heart activity is necessary. In this medical field, great progress has been achieved in the last few years, especially by applying the latest generation of mobile phones.
Nowadays ECG recording is usually performed in two main ways: by using an ECG machine for short intervals during examination by a physician, or by recording the ECG continuously for 24 hours and analyzing it later (so-called Holter monitoring). The disadvantage of the first method is that a complete diagnosis is not possible; the shortcoming of the second is that immediate intervention is not possible, which can sometimes be fatal.


The first crude ECG was described in 1903 by Willem Einthoven. His procedure has been expanded, and now the probes used to measure electricity (called electrodes) are placed on the right and left arms, left leg, and across the chest wall. Currently there are 12 standardized “leads” for the standard ECG: bipolar leads I, II and III, described by Einthoven, augmented unipolar leads aVR, aVL and aVF and unipolar precordial leads V1-V6. How do these “leads” work and what do they mean?
When current passes to the positive end of the bipolar (2-sided) electrode, it causes a positive deflection, which corresponds to an upward movement of the pen on the ECG paper. Passage of current away from the positive pole of the bipolar electrode causes a negative deflection and a downward movement of the pen on the ECG paper. Current flowing at an oblique angle to the electrode causes smaller deflection, and current flowing perpendicular to the electrode does not cause any deflection in the recorder. Thus each lead “sees” the heart in a different way. This information is recorded on paper as a series of deflections and waves.
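The "different view per lead" amounts to projecting the heart's electrical vector onto each lead's axis. A minimal sketch (Python; the cosine projection is the standard textbook idealization, not a formula from the text):

```python
import math

def deflection(magnitude: float, angle_deg: float) -> float:
    """Deflection a lead records from a cardiac current of given magnitude.

    angle_deg is the angle between the current and the lead axis:
      0 degrees   -> full positive deflection (current toward the + pole)
      90 degrees  -> no deflection (current perpendicular to the lead)
      180 degrees -> full negative deflection (current away from the + pole)
    """
    return magnitude * math.cos(math.radians(angle_deg))
```

An oblique angle gives a proportionally smaller deflection, which is exactly why twelve leads at different orientations together characterize the heart's electrical activity.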


An electrocardiogram (ECG) is the representation of the electrical activity of the heart (cardiac) muscle as it is recorded from the body surface. The heart is a muscular organ that beats in rhythm to pump blood through the body. The signals that make the heart's muscle fibers contract come from the sinoatrial node, which is the natural pacemaker of the heart. In an ECG test, the electrical impulses made while the heart is beating are recorded and usually shown on a piece of paper. This record, the electrocardiogram, reveals problems with the heart's rhythm and with the conduction of the heartbeat through the heart, which may be affected by underlying heart disease.
The muscle cells of the heart are linked so closely to one another that electrical impulses can easily spread from one cell to the next. Certain groups of cardiac cells are designed to rapidly transmit electrical activity through the heart. These specialized cells include the atrial conduction tracts, the atrioventricular (AV) node, the bundle of His, the bundle branches, and the distal ventricular conduction system.
The heart has some very specialized cells. The so-called automatic cells of the heart are capable of spontaneous depolarization. They are important in the generation of the heart rhythm, and because of this they are also known as pacemaking cells.


Networked programs link tertiary care hospitals and clinics with outlying clinics and community health centers in rural or suburban areas. The links may use dedicated high-speed lines or the Internet for telecommunication links between sites.
• Point-to-point connections using private networks are used by hospitals and clinics that deliver services directly or contract out specialty services to independent medical service providers at ambulatory care sites. Radiology, mental health and even intensive care services are being provided under contract using telemedicine to deliver the services.
• Primary or specialty care to the home connections involves connecting primary care providers, specialists and home health nurses with patients over single line phone-video systems for interactive clinical consultations.
• Home to monitoring centre links are used for cardiac, pulmonary or fetal monitoring, home care and related services that provide care to patients in the home. Often normal phone lines are used to communicate directly between the patient and the centre, although some systems use the Internet.
• Web-based e-health patient service sites provide direct consumer outreach and services over the Internet. Under telemedicine, these include those sites that provide direct patient care.


Specialist referral services typically involve a specialist assisting a general practitioner in rendering a diagnosis. This may involve a patient "seeing" a specialist over a live, remote consult or the transmission of diagnostic images and/or video along with patient data to a specialist for viewing later. Recent surveys have shown a rapid increase in the number of specialty and subspecialty areas that have successfully used telemedicine. Radiology continues to make the greatest use of telemedicine, with thousands of images "read" by remote providers each year. Other major specialty areas include: dermatology, ophthalmology, mental health, cardiology and pathology.
According to reports and studies, almost 50 different medical subspecialties have successfully used telemedicine.
• Patient consultations using telecommunications to provide medical data, which may include audio, still or live images, between a patient and a health professional for use in rendering a diagnosis and treatment plan. This might originate from a remote clinic to a physician's office using a direct transmission link or may include communicating over the Web.
• Remote patient monitoring uses devices to remotely collect and send data to a monitoring station for interpretation. Such "home telehealth" applications might include a specific vital sign, such as blood glucose or heart ECG or a variety of indicators for homebound patients. Such services can be used to supplement the use of visiting nurses.
• Medical education provides continuing medical education credits for health professionals and special medical education seminars for targeted groups in remote locations.
• Consumer medical and health information includes the use of the Internet for consumers to obtain specialized health information and on-line discussion groups to provide peer-to-peer support.


Telemedicine is the use of medical information exchanged from one site to another via electronic communications to improve patients' health status. Closely associated with telemedicine is the term "telehealth" which is often used to encompass a broader definition of remote healthcare that does not always involve clinical services. Videoconferencing, transmission of still images, e-health including patient portals, remote monitoring of vital signs, continuing medical education and nursing call centers are all considered part of telemedicine and telehealth.
Telemedicine is not a separate medical specialty. Products and services related to telemedicine are often part of a larger investment by health care institutions in either information technology or the delivery of clinical care. Even in the reimbursement fee structure, there is usually no distinction made between services provided on site and those provided through telemedicine and often no separate coding required for billing of remote services.
Telemedicine encompasses different types of programs and services provided for the patient. Each component involves different providers and consumers.


To dive into a specific technology at this point is getting a bit ahead of the story, though.
Wireless networks share several important advantages, no matter how the protocols are designed, or even what type of data they carry.
The most obvious advantage of wireless networking is mobility. Wireless network users can connect to existing networks and are then allowed to roam freely. A mobile telephone user can drive miles in the course of a single conversation because the phone connects the user through cell towers. Initially, mobile telephony was expensive. Costs restricted its use to highly mobile professionals such as sales managers and important executive decision makers who might need to be reached at a moment's notice regardless of their location. Mobile telephony has proven to be a useful service, however, and now it is relatively common in the United States and extremely common among Europeans.
As long as wireless users remain within range of the base station, they can take advantage of the network. Commonly available equipment can easily cover a corporate campus; with some work, more exotic equipment, and favorable terrain, coverage can be extended much farther.
Wireless networks typically have a great deal of flexibility, which can translate into rapid deployment. Wireless networks use a number of base stations to connect users to an existing network.


Over the past five years, the world has become increasingly mobile. As a result, traditional ways of networking the world have proven inadequate to meet the challenges posed by our new collective lifestyle. If users must be connected to a network by physical cables, their movement is dramatically reduced. Wireless connectivity, however, poses no such restriction and allows a great deal more free movement on the part of the network user. As a result, wireless technologies are encroaching on the traditional realm of "fixed" or "wired" networks.
Wireless telephony has been successful because it enables people to connect with each other regardless of location. New technologies targeted at computer networks promise to do the same for Internet connectivity. The most successful wireless networking technology thus far has been 802.11.

Normalization of Database

Normalization is a process for assigning attributes to entities. It reduces data redundancies and helps eliminate data anomalies.
Normalization works through a series of stages called normal forms:
First normal form (1NF)
Second normal form (2NF)
Third normal form (3NF)
Fourth normal form (4NF)
The highest level of normalization is not always desirable.


Normalization is only one of many database design goals.
Normalized (decomposed) tables require additional processing, reducing system speed.
Normalization purity is often difficult to sustain in the modern database environment. The conflict between design efficiency, information requirements, and processing speed is often resolved through compromises that include denormalization.
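The redundancy that normalization removes can be illustrated with a small decomposition. The sketch below (Python, with hypothetical customer/order data) pulls the customer attributes, which depend only on cust_id rather than on the whole order key, out into their own table, the basic move behind 2NF/3NF:

```python
# Denormalized rows: customer name and city are repeated on every order,
# wasting space and risking inconsistent updates (hypothetical data).
orders = [
    {"order_id": 1, "cust_id": "C1", "cust_name": "Ann", "city": "Oslo",   "total": 50},
    {"order_id": 2, "cust_id": "C1", "cust_name": "Ann", "city": "Oslo",   "total": 20},
    {"order_id": 3, "cust_id": "C2", "cust_name": "Bob", "city": "Bergen", "total": 70},
]

# Decompose: customer attributes depend only on cust_id, so store them
# once per customer; orders keep just the cust_id foreign key.
customers = {}
normalized_orders = []
for row in orders:
    customers[row["cust_id"]] = {"cust_name": row["cust_name"],
                                 "city": row["city"]}
    normalized_orders.append({"order_id": row["order_id"],
                              "cust_id": row["cust_id"],
                              "total": row["total"]})
```

The price of the cleaner design is visible here too: reconstructing the original rows now requires a join of the two tables, which is the extra processing that motivates selective denormalization.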

Database System Utilities

File Reorganization
Performance Monitoring

Database Classifications

Single-user vs. Multiuser
Centralized vs. Distributed
Data Model

Data Models

A data model is a description of the structure of a database. Data models generally fall into three categories according to the level of the description.

High-level (or conceptual) data models closely approximate the miniworld.
Representational (or implementational) data models are intermediate: close to the miniworld, but also reflecting the actual organization of data in the database.
Low-level (or physical) data models describe details of physical storage, generally transparent to casual or parametric end users.

A data model is often specified by a database schema typically displayed in a schema diagram that consists of schema constructs.

Database State

The actual data in a database at a particular instant is the database state, which consists of a set of instances for each schema construct.
Defining a database consists of specifying a schema to the DBMS. We then have a database in an empty state, with no data.
When data is first loaded, the database is in its initial state.
Subsequently, each update creates another state. The DBMS must guarantee that each such state is a valid state that satisfies schema specifications.
The schema is the intension, while a database state is the extension of the schema.
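The intension/extension distinction can be made concrete with a minimal sketch (Python; the student table and the arity-only validity check are illustrative assumptions, a real DBMS enforces far richer constraints):

```python
# Schema = intension: the structure, fixed when the database is defined.
schema = {"student": ("id", "name")}

# States = extensions: the instances present at particular instants.
empty_state   = {"student": []}                       # right after definition
initial_state = {"student": [(1, "Ana"), (2, "Li")]}  # after the first load

def valid(state, schema):
    """A state is valid here if every tuple matches its construct's arity.

    Stand-in for the DBMS's guarantee that each update yields a state
    satisfying the schema specifications.
    """
    return all(len(row) == len(schema[tbl])
               for tbl in schema for row in state[tbl])
```

Every update produces a new extension, while the intension stays unchanged until the schema itself is redefined.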

When not to use a DBMS

Main costs of using a DBMS:
- High initial investment in hardware, software, and training, and the possible need for additional hardware.
- Overhead for providing generality, security, recovery, integrity, and concurrency control.
- The generality that a DBMS provides for defining and processing data.
When a DBMS may be unnecessary:
If the database and applications are simple, well defined, and not expected to change.
If there are stringent real-time requirements that may not be met because of DBMS overhead.
If access to data by multiple users is not required.

Advantages of Using a DBMS

Controlling Redundancy in data storage and in development and maintenance efforts (avoiding duplicated effort, wasted space, and inconsistent data).
Restricting Unauthorized Access (security and authorization)
Providing Persistent Storage for Program Objects and Data Structures.
Permitting Inference and Actions Using Rules
Providing Multiple User Interfaces
Representing Complex Relationships Among data.
Enforcing Integrity Constraints
Providing Backup and Recovery

Implications of the Database Approach

-Potential for Enforcing Standards.
-Reduced Application Development Time.
-Availability of Up-to-date Information.
-Economies of Scale.

Characteristics of the Database Approach

File Processing
Each user defines and implements the files needed for a specific application, with redundancy in defining and storing data.
Database Approach
A single repository of data.
-Self-describing nature of a database system: a DBMS catalog stores the description of the database, called meta-data. This allows the DBMS software to work with different databases. The catalog records the structure of each file, the type and storage format of each data item, and constraints on the data.
-Insulation between programs and data (program-data independence): data storage structures can be changed without having to change the DBMS access programs.
-Data abstraction: a data model is used to hide storage details and present the users with a conceptual view of the database (abstract operations in OODBs).
-Support of multiple views of the data: each user may see a different view of the database, describing only the data of interest to that user (a subset or virtual data).
