Monday, February 27, 2012

Bachelor of Arts Degree or a Bachelor of Science Degree?


Ask a number of students and the top two decisions when choosing a degree will be which university to attend and which subject to study. Few students pay particular attention to whether they will be studying for a Bachelor of Arts or a Bachelor of Science degree.
What's The Difference?
The difference between the two types of degree is actually relatively small and, in most cases, comes down to the specific university.
Some experts suggest that a Bachelor of Arts degree focuses more on a liberal arts education, which may be of more use if you are uncertain as to whether you will want to continue into the related industry. Some Arts degrees also stipulate in their criteria that students are required to complete a number of credits in a foreign language. They tend to be of wider scope and place emphasis on the humanities and wide-ranging understanding in a recognized discipline.
Bachelor of Science degrees tend to include more mathematical and scientific courses. Indeed, students may have to include more statistical or research elements in their degrees, and the subject matter may be more focused on applying the methods learned in the degree. As above, students may wish to include a foreign language in their courses but, unlike the Arts degree, it is not mandatory. Students will be expected to take a more focused approach to their studies, which will include a mix of liberal arts, technical knowledge, mathematics and research, as well as practical skills that may be required whilst working in the field.
Essentially, the type of degree is traditional to the university. It is based on how many credits within the degree are focused on liberal arts courses: an Arts degree must devote at least 75% of the program to the liberal arts, while a Science degree must devote at least 50%.
Which Degree Is Better?
Whilst one degree is not necessarily better than another, a number of education experts suggest that students who complete a Bachelor of Science degree may have more flexibility and enjoy more career opportunities. The caveat is that in most cases it is preferable for the student to choose the degree that fits best with their interests, skills and career goals.
As students are now taking on more debt than ever to fund their degree, it is essential that they have a clear idea of what they want to do after completing their studies and that their qualification is tailored towards that goal. There is evidence that employers prefer graduates with a Science degree rather than an Arts degree; indeed, students who complete the former are often seen to command a higher salary.
Bachelor of Arts degrees are good options for students who have a strong interest in a particular field but may want to include other disciplines. Plus, a Bachelor of Arts degree may be more useful if the graduate subsequently chooses to change career and enter a different industry.
Bachelor of Science degrees are good for individuals who are certain of their career goals and desire in-depth knowledge of an industry prior to working in the field. It is often for this reason that graduates with a Science degree are seen by employers as the preferred candidates, as they will appear more motivated within their chosen field.

Monday, February 20, 2012

Significant Events in the History of Pharmaceutical Testing


In the 11th century, the great Persian medical thinker Ibn Sina proposed what might be called, in today's parlance, pharmaceutical quality assurance guidelines: medicines must be tested on animals before being tested on humans, medicines used in experimental trials must be unadulterated and pure, and so on. But the field of pharmaceutical testing didn't really take off until the 20th century, when the industrialized countries saw the need to regulate food and drug quality assurance and quality control.
1906 - The United States Congress passed the Pure Food and Drug Act. This law established labeling requirements for drugs. One small step for man, one giant leap for pharmaceutical quality assurance! But there was still a long way to go.
1908 - Canadian Parliament passed the Proprietary and Patent Medicine Act, which prohibited the use of cocaine in medicines and required drug companies to indicate on the label if heroin, morphine or opium was an ingredient. A small but significant nod to quality assurance and quality control.
1938 - The U.S. Congress passed the Food, Drug and Cosmetic Act, requiring new drugs to undergo pre-market pharmaceutical testing. This milestone in quality assurance and quality control came about as the result of a 1937 tragedy known as the Elixir Sulfanilamide Incident, in which a new formulation of a drug used to treat strep infections was prescribed to patients without being tested, resulting in the deaths of more than 100 people in fifteen states. For the first time, the Food and Drug Administration concerned itself with quality assurance and quality control for new drugs.
1947 - The Nuremberg Code laid the groundwork for ethical pharmaceutical testing on human participants. The Code specified that human volunteers must give informed consent and be protected from harm - a clear indictment of the wartime practice of conducting experiments on prisoners without their consent.
1950s and '60s - Various countries, including the United States, Germany and Britain, reacted to the thalidomide baby tragedy by enacting new rules governing the reporting of adverse drug effects and clinical trials. In Canada, the tragedy led to a long overdue requirement for drug manufacturers to prove efficacy before proceeding to market.
1980s - The United States government developed an incentive program for pharmaceutical companies to pursue pharmaceutical testing and development in areas with fewer patients, i.e., to cater to less lucrative but needy markets.
In the same decade, perhaps in response to the HIV crisis, the pharmaceutical quality assurance process was fast tracked for certain drugs intended to treat conditions for which no other drug alternative existed.
1996 - New international standards for drug quality assurance and quality control were spelled out at an International Conference on Harmonization held in Brussels.
As for the future of pharmaceutical quality assurance, no one can say with any certainty what it holds.

Tuesday, February 14, 2012

Quality Assurance and Quality Control Milestones, 1950-2000


Try to imagine a world without strict quality assurance and quality control guidelines - where packaged foods fail to list all ingredients and medications contain substances that are harmful to human health. It may conjure up images of peddlers of yore, hawking their tinctures and miracle powders to an unsuspecting public. That image ceased to be a reality in the 20th century, as fields such as food and pharmaceutical testing took great leaps forward, thanks in part to the emergence of three game-changing concepts: Cost of Quality, Zero Defects and Total Quality Management (TQM).
Quality Assurance and Quality Control Concept #1, Cost of Quality, 1950s
This quality assurance and quality control concept goes by several names, including "the price of nonconformance" and "the cost of poor quality." At the time of its emergence, it had one main advocate, Joseph Juran, a Romanian-born American who used statistics to study the human reasons behind organizational errors. Although he worked for a communications company, his theories and methods had far-reaching effects, and continue to impact pharmaceutical testing and food quality training today.
Quality Assurance and Quality Control Concept #2, Zero Defects, 1960s
This concept emerged in the aerospace industry in the '60s, but has left an enduring mark on the quality assurance and quality control industry. Its influence can still be felt in pharmaceutical testing and food quality training programs around the world.
Championed by Philip Crosby, a quality control manager for an American missile program, Zero Defects is an approach to quality assurance and quality control that views defects as quite simply unacceptable. According to this principle, there is no excuse for mistakes of any kind. Critics have argued that a certain level of error is unavoidable in any endeavour, to which a proponent of Zero Defects might respond that it would be unacceptable for even one in a million bottles of over-the-counter painkillers to contain a harmful substance. To anticipate error is to set one's sights too low, according to the proponents of Zero Defects.
The intolerance for error makes this concept of particular interest to the pharmaceutical testing and food quality training world.
Quality Assurance and Quality Control Concept #3, Total Quality Management, 1980s
Total Quality Management takes an integrative approach to quality assurance and quality control whereby every person, at every level of an organization, who is involved in any way with bringing a product to market is ultimately responsible for the quality of that product.
This concept, which impacts pharmaceutical testing and food quality training today, is attributed to the combined work of multiple quality assurance and quality control leaders, including the aforementioned Joseph Juran.
How has the field evolved since the turn of the millennium? Some say that there has been a transition away from a focus on manufacturing towards a focus on service - a shift that perhaps reflects some larger changes in North America, where the manufacturing sector has famously fallen on hard times.

Tuesday, February 7, 2012

Heroes in the History of Quality Assurance and Quality Control


When you go to the drugstore to buy allergy medication, you are benefitting from more than a century's worth of development in an unsung field: quality assurance and quality control. When you go to the grocery store to buy a cantaloupe, you are relying on graduates of food quality training to make sure that it was grown and packed with respect for food safety rules. But who are the pioneers behind today's food and pharmaceutical testing?
Quality Assurance and Quality Control Pioneer #1: Frederick Winslow Taylor, the father of science-based management (1856-1915)
Frederick Winslow Taylor was a leader of the Efficiency Movement, an early 20th-century movement that aimed to reduce waste in industry and society. He was so influential within this movement - the echoes of which are still felt in pharmaceutical testing and food quality training today - that it is sometimes known as Taylorism.
Taylor was born to a Quaker family in Pennsylvania. He started his working life as a machinist. In his time on the shop floor, he realized that his fellow workers were not working to their fullest capacity. This sparked his interest in the concept of productivity. He went on to promote the application of scientific principles to industrial management, a legacy that endures today in practices that guide food and pharmaceutical testing.
Quality Assurance and Quality Control Pioneer #2: Walter Shewhart, the father of statistical quality control (1891-1967)
This Illinois-born physicist worked in the Inspection Engineering Department of the Western Electric Company. When he joined, quality assurance and control was focused exclusively on inspecting the end product. He introduced a new goal - trying to minimize defects during the manufacturing process - which is still a tenet of pharmaceutical testing and food quality training today.
Quality Assurance and Quality Control Pioneer #3: William Edwards Deming, the father of the quality revolution (1900-1993)
World War II was integral to the quality movement, and Iowa-born statistician W. Edwards Deming was a leading figure of the time. A meeting with Walter Shewhart inspired him to consider the application of statistics to quality control. The resulting theories are said to have transformed how industry operates. He is still recognized as a hero in Japan for his post-war work there improving quality in the manufacturing industry.
Quality Assurance and Quality Control Pioneer #4: Joseph Juran, evangelist for quality (1904-2008)
In 1925, Romanian-born, American-raised Juran received training in statistical sampling from Bell Laboratories while employed at Western Electric. He would go on to develop a theory that resistance to new ideas was often a cause of quality assurance and quality control problems.
The next time you are in the drugstore or grocery store, take a moment to reflect on the pioneers behind food and pharmaceutical testing.

Wednesday, February 1, 2012

Crusading Chemist Harvey Wiley Paved Way for Pharmaceutical Testing


Before America had its Food and Drug Administration, there was the United States Department of Agriculture's Bureau of Chemistry, which is still famous today for Dr. Wiley's Poison Squad. This dedicated group of staff members, at the turn of the last century, bravely volunteered to ingest potentially toxic preservatives to firmly establish the risk they posed to public health, drawing international attention to the need for better food and pharmaceutical testing.
The concern these brave pioneers of food and pharmaceutical testing showed their fellow citizens was rewarded with a song, the lyrics of which survive today:
"O we're the merriest herd of hulks
that ever the world has seen;
We don't shy off from your rough
on rats or even from Paris green:
We're on the hunt for a toxic dope
That's certain to kill, sans fail."
Although his experimental approach to food and pharmaceutical quality assurance may seem unorthodox by today's standards, Dr. Wiley's legacy is undeniable. How did the Indiana-born son of a farmer come to play such a revolutionary role in the history of pharmaceutical testing?
Pharmaceutical testing pioneer Dr. Wiley's early years
Dr. Wiley was born Harvey Washington Wiley in October 1844. He fought for the Union Army in the American Civil War. After the war, this Renaissance man took degrees in the humanities, medicine and science (at Harvard, no less). How did this lead him to a career as a trailblazer in pharmaceutical quality assurance and quality control?
His interest in what would now be termed quality assurance and quality control may have begun when Indiana asked him to return home to analyze syrups and other sugars for signs of tampering or misrepresentation. There were many food scandals at the end of the 19th century. It was reportedly quite common, for example, for a bottle labeled as pure maple syrup to actually contain a substance more akin to corn syrup.
It wasn't very long before the federal government lured him to Washington with a position as Chief Chemist in the Department of Agriculture's Bureau of Chemistry, where he would fully embrace his role as a crusader for food and pharmaceutical testing, forming the so-called "poison squad": willing human guinea pigs who tested the effects of certain preservatives on humans. The goal? To establish "whether preservatives should ever be used or not, and if so, what preservatives and in what quantities."
His tests on the preservatives in foods are credited with leading to the drafting of one of the first pieces of food and pharmaceutical quality assurance legislation, the Pure Food and Drug Act, which was enacted in 1906. The field hasn't stopped evolving since!
Today's food and pharmaceutical testing practices owe an enormous debt to Dr. Wiley and his Poison Squad. The doctor and his volunteers drew America's attention to the need for quality assurance and quality control for ingestible substances.