
Typhoid and Malaria Diagnostic System

LITERATURE REVIEW

1.0 Review of Related Literature

Malaria is a very common disease and a major health problem in the world today, particularly in Africa. Malaria kills nearly twice as many people as previously thought: research has shown that malaria kills about 1.2 million people every year, almost double the 655,000 deaths estimated in 2009 (Guardian, 2012). Malaria is characterized by fever and is caused by a parasite spread by the Anopheles mosquito; the parasite has become resistant to certain treatments, and the mosquito to many insecticides (Wikipedia, 2012).


Medical diagnosis is based on different methods of investigating and identifying diseases and their severity, with the purpose of selecting and applying the necessary treatment and preventing the development of complications and recurrence. Diagnostic procedures involve interaction between the patient and medical personnel in the form of “question and answer”, which makes them good candidates for computerization. Doctors who treat malaria are expensive, limited in number, and not evenly spread across the globe. Nigeria, for instance, has twenty-eight (28) doctors per ten thousand (10,000) individuals (Africapedia, 2012).
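As a minimal illustration of this “question and answer” interaction, the sketch below collects a patient’s replies at a command line. It is written in PHP, the scripting language the system adopts in Section 1.7; the question wording and symptom names are illustrative assumptions, not the system’s actual questionnaire.

<?php
// Illustrative only: the questions and symptom names below are hypothetical.
$questions = array(
    'fever'    => 'Do you have a fever or a recent history of fever? (y/n) ',
    'headache' => 'Do you have a headache? (y/n) ',
    'vomiting' => 'Have you been vomiting? (y/n) ',
);

$answers = array();
foreach ($questions as $symptom => $prompt) {
    echo $prompt;
    $reply = strtolower(trim(fgets(STDIN)));  // read one y/n answer from the console
    $answers[$symptom] = ($reply === 'y');    // record the reply for later inference
}

print_r($answers);  // e.g. Array ( [fever] => 1 [headache] => ... )

Each recorded answer becomes a candidate fact for the rule-based inference step sketched in Section 1.1.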

In addition, many of the people who need healthcare are far removed from medical facilities. In view of the foregoing, there is a clear need for an expert system to assist doctors in diagnosing and treating malaria, which this research is focused on designing and creating. The aim of this research is, therefore, to build a web-based expert system that diagnoses malaria, suggests treatment, and provides relevant information on malaria for hospitals and individuals. The software will diagnose malaria infection from information about the patient’s symptoms and test results, recommend a medication, maintain a database of drug prescriptions and other malaria-related information, and be user-friendly.

The created system would retain the skill of an expert medical doctor in the treatment of malaria in case of any eventuality. It would also be useful to people who want to self-diagnose before seeing medical consultants, thereby reducing physicians’ workload during consultations and easing other problems associated with hospital visits. In addition, it could act as a diagnostic tool to help malaria researchers determine the intensity or concentration of malaria parasites in designated geographical locations and, in turn, help develop effective control measures to limit the spread of malaria in such regions. It would also address the problems encountered in areas where medical experts are absent or limited in number.

[Figure: Md. Zillur Rahman, Senior Medical Technologist, examines a blood test for typhoid at the Dhaka Shishu Hospital in Dhaka, Bangladesh.]

1.1 Concept of an Online Typhoid and Malaria Diagnostic System

Intelligence can be defined as the ability to learn and understand, to solve problems and to make decisions. Artificial Intelligence (AI) is the display of intelligence by machines, and its main goal as a field is to make machines do things that would require intelligence if done by humans. The central problems of AI include reasoning, knowledge, planning, learning, communication, perception and the ability to move and manipulate objects. After earlier setbacks, interest in AI was revived by the commercial success of expert systems in the 1980s, and by the early 1990s AI had achieved its greatest feats in practical application, being successfully applied in logistics, data mining and medical diagnosis. An expert system is a form of artificial intelligence program that simulates the knowledge and analytical skills of one or more human experts.

Conventional expert systems are built for a very narrow domain with clearly defined expertise, which makes system performance fully dependent on the right choice of experts. Although a common strategy is to use just one expert, multiple experts may be needed when a more complex expert system is being built or when the expertise is not well defined. XpertMalTyph is a medical expert system for diagnosing the complications of malaria and typhoid.
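To make the notion of a rule-based expert system concrete, the sketch below shows a minimal forward-chaining inference loop in PHP. The rules are invented placeholders for illustration; they are not the actual XpertMalTyph knowledge base, which is not reproduced here.

<?php
// Hypothetical IF-THEN rules; each fires when all of its conditions are known facts.
$rules = array(
    array('if' => array('fever', 'headache', 'vomiting'),      'then' => 'suspect_malaria'),
    array('if' => array('fever', 'abdominal_pain', 'rash'),    'then' => 'suspect_typhoid'),
    array('if' => array('suspect_malaria', 'suspect_typhoid'), 'then' => 'suspect_coinfection'),
);

function forwardChain(array $rules, array $facts)
{
    do {                                      // keep firing rules until no new fact appears
        $changed = false;
        foreach ($rules as $rule) {
            $conditionsHold = (count(array_diff($rule['if'], $facts)) === 0);
            if ($conditionsHold && !in_array($rule['then'], $facts, true)) {
                $facts[] = $rule['then'];     // assert the rule's conclusion as a new fact
                $changed = true;
            }
        }
    } while ($changed);
    return $facts;
}

$reported = array('fever', 'headache', 'vomiting', 'abdominal_pain', 'rash');
print_r(forwardChain($rules, $reported));
// adds suspect_malaria and suspect_typhoid, then suspect_coinfection

Recording which rules fired, and in what order, is also the simplest basis for the explanation facility that expert systems are expected to provide.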

Malaria causes significant morbidity and mortality worldwide. In developing nations, scarce resources lead to inadequate diagnostic procedures. Malaria can result in anemia (a decreased number of red blood cells). The remains of the destroyed red blood cells clump together and cause blockages in the blood vessels.

This can result in brain damage or kidney damage, which is potentially fatal. A particularly serious, potentially life-threatening malaria parasite is Plasmodium falciparum. In developing countries, limited resources may lead to clinical and laboratory misdiagnosis of malaria. Diagnosis of malaria involves identification of the malaria parasite or its antigens/products in the blood of the patient. The most commonly used diagnostic test for malaria is microscopy to detect parasites in stained blood films. Thick blood films are used in routine diagnosis, and as few as one parasite per 200 μl of blood can be detected. The method can be used to differentiate between parasite species and stages of the life cycle. Although it has good sensitivity and allows species identification and parasite counts, it is time-consuming and requires microscopy expertise and maintenance of equipment. Microscopy with fluorescent stains (QBC), dipstick antigen detection of HRP2 and pLDH (ParaSight-F, ICT Malaria Pf, OptiMAL), polymerase chain reaction assays and some automated blood cell analyzers offer new approaches, with emphasis on clinical relevance and their potential to complement conventional microscopy, especially in countries with imported malaria.

People with uncomplicated malaria often have fever, or a recent history of fever, headache, vomiting, watery diarrhea, anemia and an enlarged spleen; in children, additional symptoms include convulsions and cough. Similarly, a bacterium called Salmonella typhi (S. typhi) is responsible for typhoid. S. typhi may be spread by consuming contaminated water, beverages and food, after which the bacteria enter the intestines and then the bloodstream, from where they may spread to other parts of the body. Initial typhoid symptoms include malaise, headache, diarrhea (or constipation), sore throat, fever as high as 104 °F, and a rash.

Diagnosis is made by blood, bone marrow or stool cultures and by the Widal test. In epidemics and less wealthy countries, after excluding malaria, dysentery or pneumonia, a therapeutic trial of chloramphenicol is generally undertaken while awaiting the results of the Widal test and of blood and stool cultures. Rapid and accurate diagnosis of malaria and typhoid is integral to the appropriate treatment of affected individuals and to preventing the further spread of infection in the community. XpertMalTyph, developed in this study, gives patients in developing countries the opportunity to identify when they have malaria and/or typhoid and to seek solutions in the comfort of their homes.

1.2 Theoretical Framework

By the mid-1890s, lab tests had been introduced to detect tuberculosis, cholera, typhoid and diphtheria, but cures for these diseases would not come until later. Physicians also began to study pulse, blood pressure, body temperature and other physiological indicators, even though simple, practical instruments to measure these signs were not developed until the end of the century. The use of precise measurements in diagnosis became standard in medicine in the early 1900s. Standardized eye tests, weight and height tables, and IQ tests were all part of a movement to identify statistical norms of human physiology and behavior. The first hospital lab in Britain was set up at Guy’s Hospital, which was organized into clinical wards; two of these wards were designated for medical student rotations and had a small laboratory attached for clinical work. By 1890, most laboratory procedures in the United States were performed by the physician with a microscope in his home or office. In 1898, Sir William Osler, a Canadian physician, professor, and one of the first well-known authors in the clinical laboratory literature, established ward laboratories at Johns Hopkins Hospital, where routine tests were carried out by attending physicians, and more complex procedures and research problems were referred to the pathology laboratory. An increasing number of useful laboratory tests were discovered in the second half of the 1800s, and by the turn of the century, specific chemical and bacteriological tests for disease emerged rapidly: the organisms responsible for tuberculosis, cholera, typhoid and diphtheria were isolated in the 1880s, the spirochete that causes syphilis was identified in 1905, and the Wassermann test for syphilis was introduced in 1906. Advances in the analysis of urine and blood gave physicians additional diagnostic tools.

These innovations were the result of progress in basic science that made it possible to duplicate successful applications more rapidly than ever before. The earlier advances in immunization, such as smallpox vaccination, had been purely empirical discoveries and were not quickly repeated. Microbiology for the first time enabled physicians to link disease-causing organisms, symptoms and lesions systematically. The principles that Pasteur demonstrated in the development of anthrax and rabies vaccines now provided a rational basis for developing vaccines against typhoid, cholera and plague.

1.3 Related Work

Expert systems are a branch of applied artificial intelligence (AI) that came to light in the mid-1960s. The basic idea of an expert system is to transfer the expertise of a human domain expert to a computing system, which makes inferences and arrives at a specific conclusion as the human expert would. Medical artificial intelligence is primarily concerned with the construction of AI programs that perform diagnosis and make therapy recommendations. Expert systems today are most likely to be found in clinical laboratories and educational settings, in clinical surveillance, or in data-rich areas like the intensive-care setting. What is now being realized is that, when they fill an appropriate role, intelligent programs do indeed offer significant benefits. One of the most important tasks now facing developers of AI-based systems is to characterize accurately those aspects of medical practice that are best suited to the introduction of artificial intelligence systems. Searle’s strong AI hypothesis states: “The appropriately programmed computer with the right inputs and outputs would thereby have a mind in exactly the same sense human beings have minds”. Turing’s “polite convention” states: “If a machine acts as intelligently as a human being, then it is as intelligent as a human being.”

The Turing test can be used to describe systems that act like humans, and this simple test has become an essential tool in the development and verification of expert systems. General features of intelligent systems include coping with uncertainty, data-driven reasoning, data/knowledge representation, a user interface, and the ability of the system to explain the reasoning process it used to reach a recommendation.

The general strengths of intelligent systems include providing consistent answers for repetitive decisions, processes and tasks; holding and maintaining significant levels of information; encouraging organizations to clarify the logic of their decision-making; never “forgetting” to ask a question as a human might; availability at all times; and, for multi-user expert systems, serving several users at a time. Their weaknesses include a lack of the common sense needed in some decision-making and for fast, intuitive decisions; an inability to respond creatively to unusual circumstances as a human expert would; the difficulty human domain experts may have in articulating their logic and reasoning for the knowledge base; susceptibility to knowledge-base errors that lead to wrong decisions; and rigidity in changing environments unless the knowledge base is updated.

The MYCIN program for infectious diseases is one of the earliest medical expert systems to have been developed. It was designed to diagnose and prescribe treatment for infectious diseases, particularly spinal meningitis and bacterial infections of the blood. It first decides what bacterium caused the disease and then suggests what antibiotic to give the patient. It is very helpful for physicians who lack expertise in certain diseases because it gives reasons for suggesting a diagnosis and recommending a treatment. The setback of MYCIN is that it runs on large time-shared systems (slow response), and it is not suitable for the treatment of malaria.

XDIS is an expert system that was designed to assist physicians in diagnosis. The system contains information on more than three hundred (300) internal diseases and pathologic syndromes most frequently encountered in general practice. For each set of symptoms entered for a case, the system produces a full list of possible diagnoses, ranked from the most probable to the least probable.

The time to work out a diagnosis is usually less than ten (10) minutes. XDIS helps make a preliminary diagnosis on a patient’s first visit to the physician, decide on the necessity of referring the patient to a specialist, and select medical tests for a more exact diagnosis [3]. Its setback is that it gives a probable list of diagnoses, not an exact diagnosis. Emerge is another example of a rule-based diagnostic expert system, designed to be used in an emergency room only. The system uses a form of production rules that incorporates weighting factors determined by a neural network. The neural network is composed of input and output blocks with a hidden-layer block in between that communicates input to output; it learns from examples and then predicts an output based on this knowledge. The system also uses an IF-THEN-UNLESS statement instead of an IF-THEN statement so that the decision process may be more precise, the results more accurate, and the explanations better understood (a sketch of this rule form follows below). Its setback is that it is difficult to maintain, manage and upgrade, since it is not web-based, besides its restriction to emergency-room usage.
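The following PHP sketch illustrates the IF-THEN-UNLESS rule form attributed to Emerge. The rule content and the weight value are hypothetical; in Emerge the weighting factors are reported to come from a neural network, which is not modeled here.

<?php
// A single hypothetical rule in IF-THEN-UNLESS form.
$rule = array(
    'if'     => array('chest_pain', 'shortness_of_breath'),
    'then'   => 'suspect_cardiac_event',
    'unless' => array('recent_chest_injury'),  // an exception that blocks the conclusion
    'weight' => 0.8,                           // stand-in for a learned weighting factor
);

function fires(array $rule, array $facts)
{
    $allIfHold = (count(array_diff($rule['if'], $facts)) === 0);         // every IF condition holds
    $exception = (count(array_intersect($rule['unless'], $facts)) > 0);  // an UNLESS condition holds
    return $allIfHold && !$exception;                                    // IF ... THEN ... UNLESS ...
}

$facts = array('chest_pain', 'shortness_of_breath');
if (fires($rule, $facts)) {
    printf("%s (weight %.1f)\n", $rule['then'], $rule['weight']);
}

The UNLESS clause lets a single exceptional finding veto a conclusion that the IF conditions would otherwise support, which is the stated motivation for the construct.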

Your Diagnosis is an online medical diagnosis and symptom-analysis system. It asks several questions about body systems and symptoms; allergies, medications and immunizations are recorded, as well as family history and past medical problems. It then performs a complex analysis of all the information gathered and produces a list of all possible and probable medical diagnoses. It is online and can be interacted with in stages. All provided information can be securely stored as a confidential personal health record for future retrieval, and the system produces a confidential medical report that can be printed or emailed for personal use. The setback of Your Diagnosis is its complexity in trying to diagnose and treat all ailments in one sweep. GIDEON was developed ten years ago by specialists in infectious diseases and biostatistics, and by computer scientists, at university-based medical schools in the United States and Israel. GIDEON is a computer program for diagnosis and reference in the fields of tropical and infectious diseases, epidemiology, microbiology and antimicrobial chemotherapy. It was designed to diagnose all the world’s infectious diseases based on symptoms, laboratory testing and dermatological profile. It helps in diagnosing infectious diseases but is difficult to maintain, manage and upgrade because it is not web-based; its attempt to diagnose all infectious diseases also introduces certain complexities.

 

1.4 History of an Online Typhoid and Malaria Diagnostic System

Three distinct periods in the history of medicine are associated with three different places and, therefore, different methods of determining diagnosis: from the Middle Ages to the 18th century, bedside medicine was prevalent; then between 1794 and 1848 came hospital medicine; and from that time forward, laboratory medicine has served as medicine’s lodestar. The laboratory’s contribution to modern medicine has only recently been recognized by historians as something more than the addition of another resource to medical science and is now being appreciated as the seat of medicine, where clinicians account for what they observe in their patients. The first medical diagnoses made by humans were based on what ancient physicians could observe with their eyes and ears, which sometimes also included the examination of human specimens. The ancient Greeks attributed all disease to disorders of bodily fluids called humors, and during the late medieval period, doctors routinely performed uroscopy. Later, the microscope revealed not only the cellular structure of human tissue, but also the organisms that cause disease. More sophisticated diagnostic tools and techniques—such as the thermometer for measuring temperature and the stethoscope for measuring heart rate—were not in widespread use until the end of the 19th century. The clinical laboratory would not become a standard fixture of medicine until the beginning of the 20th century. This section reviews the history and development of diagnostic methods from ancient to modern times, as well as the evolution of the clinical laboratory from the late 19th century to the present.

1.4.1 Ancient diagnostic methods

In ancient Egypt and Mesopotamia, the earliest physicians made diagnoses and recommended treatments based primarily on observation of clinical symptoms. Palpation and auscultation were also used. Physicians were able to describe dysfunctions of the digestive tract, heart and circulation, the liver and spleen, and menstrual disturbances; unfortunately, this empiric medicine was reserved for royalty and the wealthy. Other less-than-scientific methods of diagnosis used in treating the middle and lower classes included divination through ritual sacrifice to predict the outcome of illness. Usually a sheep would be killed before the statue of a god. Its liver was examined for malformations or peculiarities; the shape of the lobes and the orientation of the common duct were then used to predict the fate of the patient. Ancient physicians also began the practice of examining patient specimens. The oldest known test on body fluids was done on urine in ancient times (before 400 BC). Urine was poured on the ground and observed to see whether it attracted insects. If it did, patients were diagnosed with boils. The ancient Greeks also saw the value in examining body fluids to predict disease. Around 400 BC, Hippocrates promoted the use of the mind and senses as diagnostic tools, a principle that played a large part in his reputation as the “Father of Medicine.” The central Hippocratic doctrine of humoral pathology attributed all disease to disorders of fluids of the body. To obtain a clear picture of disease, Hippocrates advocated a diagnostic protocol that included tasting the patient’s urine, listening to the lungs, and observing skin color and other outward appearances. Beyond that, the physician was to “understand the patient as an individual.” Hippocrates related the appearance of bubbles on the surface of urine specimens to kidney disease and chronic illness. He also related certain urine sediments and blood and pus in urine to disease.

The first description of hematuria, or the presence of blood in urine, by Rufus of Ephesus surfaced at around AD 50 and was attributed to the failure of kidneys to function properly in filtering the blood. Later (c. AD 180), Galen (AD 131–201), who is recognized as the founder of experimental physiology, created a system of pathology that combined Hippocrates’ humoral theories with the Pythagorean theory, which held that the four elements (earth, air, fire and water) corresponded to various combinations of the physiologic qualities of dry, cold, hot and moist. These combinations of physiologic characteristics corresponded roughly to the four humors of the human body: hot + moist = blood; hot + dry = yellow bile; cold + moist = phlegm; and cold + dry = black bile. Galen was known for explaining everything in light of his theory and for having an explanation for everything. He also described diabetes as “diarrhea of urine” and noted the normal relationship between fluid intake and urine volume. His unwavering belief in his own infallibility appealed to complacency and reverence for authority. That dogmatism essentially brought innovation and discovery in European medicine to a standstill for nearly 14 centuries. Anything relating to anatomy, physiology and disease was simply referred back to Galen as the final authority from whom there could be no appeal.

1.4.2 Middle Ages

In medieval Europe, early Christians believed that disease was either punishment for sin or the result of witchcraft or possession. Diagnosis was superfluous. The basic therapy was prayer, penitence, and invocation of saints. Lay medicine based diagnosis on symptoms, examination, pulse, palpation, percussion, and inspection of excreta and sometimes semen. Diagnosis by “water casting” (uroscopy) was practiced, and the urine flask became the emblem of medieval medicine. By AD 900, Isaac Judaeus, a Jewish physician and philosopher, had devised guidelines for the use of urine as a diagnostic aid; and under the Jerusalem Code of 1090, failure to examine the urine exposed a physician to public beatings. Patients carried their urine to physicians in decorative flasks cradled in wicker baskets and, because urine could be shipped, diagnosis at long distance was common. The first book detailing the color, density, quality and sediment found in urine was written around this time, as well. By around AD 1300, uroscopy became so widespread that it was at the point of near universality in European medicine. Medieval medicine also included interpretation of dreams in its diagnostic repertoire. Repeated dreams of floods indicated “an excess of humors that required evacuation,” and dreams of flight signified “excessive evaporation of humors.”

1.4.3 Seventeenth century

The medical advances of the 17th century consisted mostly of descriptive works of bodily structure and function that laid the groundwork for diagnostic and therapeutic discoveries that followed. The status of medicine was helped along by the introduction of the scientific society in Italy and by the advent of periodical literature. Considered the most momentous event in medical history since Galen’s time, the discovery of the circulation of blood by William Harvey (1578–1657) marked the beginning of a period of mechanical explanations for a variety of functions and processes, including digestion, metabolism, respiration and pregnancy. The English scientist proved through vivisection, ligation and perfusion that the heart acts as a muscular pump propelling the blood throughout the body in a continuous cycle. The invention of the microscope opened the door to the invisible world just as Galileo’s telescope had revealed a vast astronomy. The earliest microscopist was a Jesuit priest, Athanasius Kircher (1602–1680) of Fulda (Germany), who was probably the first to use the microscope to investigate the causes of disease. His experiments showed how maggots and other living creatures developed in decaying matter. Kircher’s writings included an observation that the blood of patients with the plague contained “worms”; however, what he thought to be organisms were probably pus cells and red blood corpuscles, because he could not have observed Bacillus pestis with a 32-power microscope.

Robert Hooke (1635–1703) later used the microscope to document the existence of “little boxes,” or cells, in vegetables and inspired the works of later histologists; but some of the greatest contributions to medical science came from the Italian microscopist Marcello Malpighi (1628–1694). Malpighi, who is described as the founder of histology, served as physician to Pope Innocent XII and was famous for his investigations of the embryology of the chick and the histology and physiology of the glands and viscera. His work in embryology describes the minutiae of the aortic arches, the head fold, the neural groove, and the cerebral and optic vesicles. Uroscopy was still in widespread use and had gained popularity as a method to diagnose “chlorosis,” the disease of love-sick young women, and sometimes to test for chastity. Other methods of urinalysis had their roots in the 17th century, as well. The gravimetric analysis of urine was introduced by the Belgian mystic, Jean Baptiste van Helmont (1577–1644). Van Helmont weighed a number of 24-hour specimens, but was unable to draw any valuable conclusions from his measurements. It was not until the late 17th century—when Frederik Dekkers of Leiden, Netherlands, observed in 1694 that urine that contained protein would form a precipitate when boiled with acetic acid—that urinalysis became more scientific and more valuable.

The best qualitative analysis of urine at the time was pioneered by Thomas Willis (1621–1675), an English physician and proponent of chemistry. He was the first to notice the characteristic sweet taste of diabetic urine, which established the principle for the differential diagnosis of diabetes mellitus and diabetes insipidus. Experiments with blood transfusion were also getting underway with the help of a physiologist in Cornwall, England, named Richard Lower (1631–1691). Lower was the first to perform direct transfusion of blood from one animal to another. Other medical innovations of the time included the intravenous injection of drugs, transfusion of blood, and the first attempts to use pulse rate and temperature as indicators of health status.

1.4.4 Eighteenth century

The 18th century is regarded as the “Golden Age” of the successful practitioner, as well as the successful quack. Use of phrenology (the study of the shape of the skull to predict mental faculties and character), magnets, and various powders and potions for treatment of illness were a few of the more popular scams. The advancement of medicine during this time was more theoretical than practical. Internal medicine was improved by new textbooks that cataloged and described many new forms of disease, as well as by the introduction of new drugs, such as digitalis and opium. The state of hospitals in the 18th century, however, was alarming by today’s standards. Recovery from surgical operations was rare because of septicemia. The concept of antisepsis had not yet been discovered, and hospitals were notorious for filth and disease well into the 19th century. One notable event that is a forerunner to the modern practice of laboratory measurement of prothrombin time, plasma thromboplastin time and other coagulation tests, was the discovery of the cause of coagulation. An English physiologist, William Hewson (1739–1774) of Hexham, Northumberland, England, showed that when the coagulation of the blood is delayed, a coagulable plasma can be separated from the corpuscles and skimmed off the surface. Hewson found that plasma contains an insoluble substance that can be precipitated and removed from plasma at a temperature slightly higher than 50°C. Hewson deduced that coagulation was the formation in the plasma of a substance he called “coagulable lymph,” which is now known as fibrinogen. A later discovery that fibrinogen is a plasma protein and that in coagulation it is converted into fibrin attests to the importance of Hewson’s work.

The clinical diagnostic methods of percussion, temperature, heart rate and blood pressure measurements were further refined, and there were some remarkable attempts to employ precision instruments in diagnosis. Leopold Auenbrugger (1722–1809) was the first to use percussion of the chest in diagnosis in 1754 in Vienna. This method involved striking the patient’s chest while the patient held his or her breath. Auenbrugger proposed that the chest of a healthy person sounds like a cloth-covered drum. A student of Auenbrugger’s, Jean Nicolas Corvisart, a French physician at La Charité in Paris, pioneered the accurate diagnosis of heart and lung diseases using Auenbrugger’s chest-thumping technique. Corvisart’s translation of Auenbrugger’s treatise on percussion, “New Invention to Detect by Percussion Hidden Diseases in the Chest,” popularized the practice of thumping on a patient’s chest: the resulting sounds differ when the lungs contain lesions or fluids, an observation that was validated by postmortem examination. James Currie (1756–1805), a Scot, was the first to use cold baths in treatment of typhoid fever; and by monitoring the patient’s temperature using a thermometer, he was able to adjust the temperature and frequency of the baths to treat individual patients. It took another hundred years, however, before thermometry became a recognized feature in clinical diagnosis.

In 1707, Sir John Floyer (1649–1734) of Staffordshire, England, introduced the concept of measuring pulse rate by timing pulse beats with a watch. He counted the beats per minute and tabulated the results; but his work was ignored because of the prevailing old Galenic doctrine that there is a special pulse for every disease. The groundbreaking work for measuring blood pressure was done by Stephen Hales (1677–1761), an English clergyman. By fastening a long glass tube inside a horse’s artery, Hales devised the first manometer or tonometer, which he used to make quantitative estimates of the blood pressure, the capacity of the heart and the velocity of blood current. Hales’ work was the precursor for the development of the sphygmomanometer used today to measure arterial blood pressure. Additional advances in urinalysis occurred with J.W. Tichy’s observations of sediments in the urine of febrile patients (1774); Matthew Dobson’s proof that the sweetness of the urine and blood serum in diabetes is caused by sugar (1776); and the development of the yeast test for sugar in diabetic urine by Francis Home (1780).

1.4.5 Nineteenth century

The emergence of sophisticated diagnostic techniques and the laboratories that housed them coincides roughly with the worldwide political, industrial and philosophical revolutions of the 19th century, which transformed societies dominated by religion and aristocracy into those dominated by the industrial, commercial and professional classes. In the decades after the Civil War, American laboratories and their champions were met with a vehement skepticism of science, which was viewed by some as an oppressive tool of capitalist values. The lay public, as well as many practitioners, saw the grounding of medicine in the laboratory as a removal of medical knowledge from the realm of common experience and as a threat to empiricism. Many American physicians who went abroad to Germany and France for supplementary training came back to impart the ideals of European medicine, as well as those of higher education for its own sake, to an American society that found these beliefs threatening. The lab itself was not seen as a threat, but rather the claims it made to authority over medical practice were assailed. The empiricists and the “speculative medical experimentalists” were for the most part divided along generational lines. The older empiricists had a stake in continuing their careers independent of a medical infrastructure or system, while the younger physicians aspired to careers in academic medical centers patterned after German institutions.

The younger, more energetic ranks had to first lobby for such facilities to be built; and as older doctors retired from teaching posts and turned over editorship of scientific journals, this opposition to the lab faded. Medical historians also note that the 19th century was one in which the rest of therapeutics lagged behind, and called it an era of public health. New discoveries in bacteriology allowed for water treatment and pasteurization of milk, which significantly decreased mortality rates. In addition, the advent of antiseptic surgery in the late 19th century reduced the mortality from injuries and operations and increased the range of surgical work. Medical practitioners relied, for a time, more on increased hygiene and less on drugs. Advances in public and personal hygiene had dramatically improved the practice of medicine; predictions were even made that the pharmacopoeia of the time would eventually be reduced to a small fraction of its size. At the beginning of the century, physicians depended primarily on patients’ accounts of symptoms and superficial observation to make diagnoses; manual examination remained relatively unimportant. By the 1850s, a series of new instruments, including the stethoscope, ophthalmoscope and laryngoscope, began to expand the physician’s sensory powers in clinical examination. These instruments helped doctors to move away from a reliance on the patients’ experience of illness and gain a more detached relationship with the appearance and sounds of the patients’ body to make diagnoses. Another wave of diagnostic tools—including the microscope and the X-ray, chemical and bacteriological tests, and machines that generated data on patients’ physiological conditions, such as the spirometer and the electrocardiograph—produced data seemingly independent of the physician’s as well as the patient’s subjective judgment.

These developments had uncertain implications for professional autonomy: They further reduced dependence on the patient, but they increased dependence on capital equipment and formal organization of medical practice. These detached technologies added a highly persuasive rhetoric to the authority of medicine. They also made it possible to move part of the diagnostic process behind the scenes and away from the patient where several physicians could have simultaneous access to the evidence. The stethoscope, for example, could only be used by one person at a time, but lab tests and X-rays enabled several doctors to view and discuss the results. This team approach to diagnosis strengthened the medical community’s claim to objective judgment. Equipped with methods for measuring, quantifying and qualifying, doctors could begin to set standards of human physiology, evaluate deviations, and classify individuals.

 

1.5 Types of an Online Typhoid and Malaria Diagnostic System

The systems of this kind reviewed in Section 1.3 can be summarized as follows:

  • The MYCIN program for infectious diseases, one of the earliest medical expert systems, was designed to diagnose and prescribe treatment for infectious diseases, particularly spinal meningitis and bacterial infections of the blood. Its setbacks are its slow response on large time-shared systems and its unsuitability for the treatment of malaria.

  • XDIS, a physician-support system covering more than three hundred (300) internal diseases and pathologic syndromes, ranks possible diagnoses from the most probable to the least probable, usually in under ten minutes [3]. Its setback is that it gives a probable list of diagnoses, not an exact diagnosis.

  • Emerge, a rule-based emergency-room system, uses production rules in IF-THEN-UNLESS form with weighting factors determined by a neural network. Its setbacks are that it is difficult to maintain, manage and upgrade, since it is not web-based, and that it is restricted to emergency-room usage.

  • Your Diagnosis, an online medical diagnosis and symptom-analysis system, questions the patient in stages, stores a confidential personal health record, and produces a medical report that can be printed or emailed. Its setback is its complexity in trying to diagnose and treat all ailments in one sweep.

  • GIDEON, a diagnosis and reference program for tropical and infectious diseases, epidemiology, microbiology and antimicrobial chemotherapy, was developed at university-based medical schools in the United States and Israel. It is difficult to maintain, manage and upgrade because it is not web-based, and its attempt to diagnose all the world’s infectious diseases introduces certain complexities.

1.6 Importance of an Online Typhoid and Malaria Diagnostic System

  • It retains the skill of an expert medical doctor in the treatment of malaria and typhoid in case of any eventuality.
  • It enables self-diagnosis before seeing a medical consultant, thereby reducing physicians’ workload and easing other problems associated with hospital consultations.
  • It provides diagnostic support in areas where there are no medical experts or where medical experts are limited in number.
  • It can assist malaria researchers in determining the intensity or concentration of malaria parasites in designated geographical locations, helping to develop effective control measures.
  • It supports the rapid and accurate diagnosis that is integral to appropriate treatment and to preventing the further spread of infection in the community.
  • It gives patients in developing countries the opportunity to identify when they have malaria and/or typhoid and to seek solutions in the comfort of their homes.
 

1.7 Development of an Online Typhoid and Malaria Diagnostic System

An extensive literature review was done on the subject matter. The design-and-creation research strategy (Oates, 2009) was employed. Necessary expert information was collected using structured interviews and through the internet. The waterfall software development model was adopted, primarily because it is simple to understand and implement and prescribes a systematic approach to software development (Hughes & Cotterell, 2009; Yogi, 2012). The software development environment includes PHP (Hypertext Preprocessor, the scripting language used to connect to the database), Structured Query Language (SQL), and Hypertext Markup Language (HTML, used for the user interface) with JavaScript (JS) and Cascading Style Sheets (CSS) to produce an interactive user interface that connects to a database. Unit and integrated system testing of the code was done, and black-box system testing was also performed.
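As a sketch of the PHP-to-MySQL link described above, the snippet below stores one drug prescription using PHP’s PDO extension, which has been part of core PHP since version 5.1. The database name, credentials, table and column names are assumptions made for illustration; the system’s actual schema is not given here.

<?php
// Connection details and schema below are illustrative assumptions.
$pdo = new PDO(
    'mysql:host=localhost;dbname=xpertmaltyph',
    'dbuser',
    'dbpass',
    array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION)  // surface SQL errors as exceptions
);

// Store one drug prescription produced by a completed diagnosis session.
$stmt = $pdo->prepare(
    'INSERT INTO prescriptions (patient_id, diagnosis, drug, dosage)
     VALUES (:patient_id, :diagnosis, :drug, :dosage)'
);
$stmt->execute(array(
    ':patient_id' => 42,
    ':diagnosis'  => 'uncomplicated malaria',
    ':drug'       => 'artemisinin-based combination therapy',
    ':dosage'     => 'as directed by a physician',
));

Prepared statements keep patient-supplied values out of the SQL text itself, which matters for a web-facing system.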

The minimum hardware required to ensure the proper running of the proposed application is a standard processor, at least 128 MB of RAM, a hard disk capacity of at least 10 GB, and an SVGA color monitor. The software required is a Windows operating system, a JavaScript-enabled web browser, Apache Server version 1.0 or higher, MySQL version 5.0.51b or higher, PHP version 5.1.6, and the WAMP package.

References


[1] Guardian (2012). Malaria deaths research. Retrieved from: http://www.guardian.co.uk/society/2012/feb/03/malaria-deaths-research

[2] Wikipedia (2012). Malaria. Retrieved from: http://en.wikipedia.org/wiki/Malaria

[3] Africapedia (2012). Doctor to patient ratio in Africa. Retrieved from: http://www.africapedia.com/DOCTOR-TO-PATIENT-RATIO-IN-AFRICA

[4] Oates, B. J. (2009). Researching Information Systems and Computing. London: SAGE.

[5] Hughes, B. and Cotterell, M. (2009). Software Project Management. London: McGraw-Hill Education.

[6] Yogi, B. (2012). Software Development Life Cycle. Retrieved from: condor.depaul.edu/jpetlick/extra/394/Session1.ppt

[7] Stair, R. M. and Reynolds, G. W. (2007). Fundamentals of Information Systems. Boston: Course Technology.

[8] Nammuni, K., Pickering, C., Modgil, S., Montgomery, A., Hammond, P., Wyatt, J. C., Altman, D. G., Dunlop, R. and Potts, H. W. W. (2004). Design-a-trial: a rule-based decision support system for clinical trial design. Knowledge-Based Systems, Vol. 17, pp. 121–129.

[9] OpenClinical (2012). Retrieved from: http://www.openclinical.org/aiinmedicine.html

[10] Awodele, O. and Omotunde, A. (2011). General Introduction to Artificial Intelligence. Ibadan: FrancoOla, pp. 5–50.

[11] WikiAnswers (2012). The advantages and disadvantages of expert systems. Retrieved from: http://wiki.answers.com/Q/What_are_the_advantages_and_disadvantages_of_expert_system

[12] Buchanan, B. G. and Shortliffe, E. H. (1984). Rule-Based Expert Systems: The MYCIN Experiments of the Stanford Heuristic Programming Project. Retrieved from: http://www.amia.org/staff/eshortliffe/Buchanan-Shortliffe-1984/MYCIN%20Book.html

[13] Rodionov, V. (1989). Diagnosis is made by XDIS. Retrieved from: http://mipt.soix.com/xdis.html

[14] Herzner, J. and Kubiska, M. (1992). Expert System Example. Retrieved from: http://www.rpi.edu/dept/chem-eng/Biotech-Environ/EXPERT/expmed.html

[15] YourDiagnosis (2012). Website. Retrieved from: http://yourdiagnosis.com

[16] Gideon (2012). Global Infectious Disease Epidemiology Network. Retrieved from: http://www.gideononline.com/