Friday, January 31, 2020

Technology vs Man Essay

Technology versus man is not only a theme found in literature, such as when the scientist Frankenstein created the monster that came alive and turned on him, but is also a theme found in the real-life world of the American economy. The Economist is a weekly newspaper focusing on politics and business news and opinion. It ran an article called "Into the Unknown" which put forth the idea that changes in technology that destroy jobs can also create new ones. The machine (created by man) will not necessarily turn on the man and destroy him by taking away his means of making a living. Though the machine may eliminate one means of making a living, in so doing it may create a number of new ones. When technology starts to eliminate jobs, it also creates an opportunity to profit from the creation of new jobs.

"Into the Unknown" says the fear that a rise in technology would cause a decline in jobs is not a new one. In 1929, the American economist Stuart Chase predicted in his book Men and Machines that the creation of machines to do the work men once did would soon destroy the American economy. The machines would go on producing the same amount of product, but jobless people would not have the money to buy it. He felt that this economic disaster was just around the corner. According to the article, time has proven him wrong. What Chase didn't understand was that the machine that destroyed one job set the course for the creation of new, possibly unthought-of jobs.

Economic predictions are often wrong and short-sighted. Even short-term labor-market predictions can be wrong, as seen in the 1988 example of the twenty occupations that the government predicted would suffer the most job losses between 1988 and 2000: half of those occupations gained jobs instead of losing them. The fear of outsourcing jobs to other countries is another modern economic fear, according to the article; but the author feels outsourcing could be a way of shedding less valuable jobs and then using those workers to do more valuable ones. Retraining workers, and remembering the human desire for new technology, will help keep Americans working.

The idea that changes in technology that destroy jobs can create new ones is not accepted by everyone. Two hundred years ago in England, a group called the Luddites – 19th-century English textile workers – revolted against industrialization by sabotaging mechanical looms. They felt these looms made it possible to replace them with less-skilled, low-wage workers, leaving them without work. An English economist of that day, David Ricardo, was among the first to predict that technology would result in unemployment. Many others have agreed with that prediction – economists such as John Maynard Keynes, Wassily Leontief, Peter Drucker, and Stuart Chase. Yet despite massive mechanization and automation, the U.S. economy has kept creating jobs.

These fears are still being advanced today by people such as Massachusetts Institute of Technology professors Erik Brynjolfsson and Andrew P. McAfee. If Stuart Chase thought economic disaster was just around the corner, these men believe we have turned that corner. They believe automation is replacing people faster than the economy can create jobs. Catherine Mann, who is quoted in The Economist article, disagrees with this assessment. She says that Information Technology (IT) jobs rose from 1999 to 2003 (Behrens 246).
IT jobs, such as health information experts, machine-to-machine communications enablers, and outsourcing/offshoring managers, are increasing. Ms. Mann actually predicts an IT labor shortage (Behrens 246).

Is America on the brink of economic disaster because IT is replacing men with machines, or is IT actually producing more jobs than it is taking away, as the theme states? History would lead one to believe that the economy will adjust as it always has and that new jobs will be generated; yet a new variable could prove that wrong. America's gross domestic product has grown 75% since 2009, yet unemployment has hovered above 9% over the same period. This would suggest that Stuart Chase and his kind are right.

The new variable is Moore's Law: computing power roughly doubles every eighteen months, a pace that has held more or less since the integrated circuit was introduced in 1958. To illustrate this kind of growth, if one grain of rice were placed on the first square of a chessboard, two on the second, four on the third, and so on, doubling with each square, then by the time all sixty-four squares were filled the pile of rice would dwarf Mount Everest. Simply put, computers have grown enormously more powerful over the past fifty years. An example of this is the advancement in pattern recognition, which now rivals or surpasses human capability in some tasks, as seen in autonomous vehicles and voice-recognition software.

Over time, a well-functioning economy should adjust to technological unemployment, but it is important that workers learn new skills and that new business models be invented. As the article states, computer professionals have learned that maintaining standard business-software packages is no longer lucrative, but tailoring business software and services is. There is not a big supply of IT graduates to recruit and train in America; therefore, companies have to retrain their employees in these sought-after skills (Behrens 246).

When technology starts to eliminate jobs, it also creates an opportunity to profit from the creation of new jobs. Even if it is possible that we have actually turned the corner in our economy and that technology is eliminating jobs faster than they can be created, it is not the time to throw one's hands in the air and give up. It would be good to remember that technology has created jobs today that would not have been dreamed of twenty-five to fifty years ago. Who knows what jobs will be available twenty-five to fifty years from now?
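As a side note, the chessboard illustration above can be checked with a short calculation. The figures below are a back-of-the-envelope sketch of my own, not numbers from the article; the 25 mg per grain of rice is an assumed round figure:

\[ \sum_{k=0}^{63} 2^k = 2^{64} - 1 \approx 1.84 \times 10^{19} \text{ grains}, \qquad 1.84 \times 10^{19} \times 25\,\text{mg} \approx 4.6 \times 10^{11}\ \text{tonnes}. \]

Hundreds of billions of tonnes of rice is indeed mountain-scale, which is the point of the illustration.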

Thursday, January 23, 2020

Vaccination: A Necessary Precaution Essay

The issue of vaccinations and their side effects has been a subject of debate throughout society and medicine for a number of years. Some continue to believe that vaccinations are harmful and actually promote disease, but the truth is that the concept of immunization is one of the most significant advances in scientific history and has led to the prevention of countless diseases and epidemics throughout the world. Still, despite the overall improvement of public health, the use of vaccinations remains controversial and is constantly challenged. Vaccination critics argue that the serious side effects associated with vaccines have been underreported, underfunded, and rarely researched. This, however, is false. Vaccinations are a necessary part of society because they prevent the spread of major diseases, reduce the severity of illness, boost one's immune system, and in turn protect the populace from potential epidemics.

The definition of a vaccination, as stated by the Encyclopedia Britannica, is "a suspension of weakened, killed, or fragmented microorganisms or toxins or of antibodies or lymphocytes that is administered primarily to prevent disease" (Encyclopedia Britannica). Vaccines stimulate the immune system to attack the specific harmful agent and then cause the antibodies to remain sensitized in case the agent should ever reappear in one's system. Obviously, this can be helpful when trying to prevent disease, or any other illness for that matter, since the antibodies specific to that type of illness remain present in one's body in case the illness returns. Since infants are extremely susceptible to infection, many are vaccinated as early as the first month of life. This helps p...

...ubmed/11032190>.

Encyclopedia Britannica. "Vaccine." Encyclopedia Britannica. Web.

Fisher, Barbara Loe. "In the Wake of Vaccines." Mothering.com 126 (2004): n. pag. Web. 30 Nov 2010.

Narins, Brigham. World of Microbiology and Immunology. Vol. 1. Farmington Hills, MI: Gale, 2003. Web.

Riedel, Stefan. "Edward Jenner and the History of Smallpox and Vaccination." Baylor University Medical Center Proceedings 18.1 (2005): 21-25. Web. 1 Dec 2010.

Williams, Tony. "God's Will and Dead Viruses." Internet Review of Books Sep. 2010: 304. Web. 30 Nov 2010.

Wednesday, January 15, 2020

Declaratory Theory

â€Å"Declaratory theory is propounded on the belief that judges' decisions never make law, rather they only constitute evidence of what the law is. However, this view is no longer accepted. There are three reasons for the persistence of the declaratory theory. In the first place, it appealed in the separation of powers. Secondly, it concealed the fact that judge-made law is retrospective in its effect and finally, when the judges confronted with a new, unusual, or different point, they tend to present as if the answer is provided by the common law.One of the most widely-accepted principles of the English legal system is what is known as the ‘declaratory theory' of judicial decision-making. This principle states that when judges are required to make decisions, they do not create or change the law, they merely ‘declare' it. That is, a judge says what he or she finds the law to be; no ‘new' law is ever created by judges. New law comes from Parliament. For example, th e Criminal Justice Bill that is currently going through Parliament will make fairly radical changes to the criminal law.It will take away the blanket immunity that currently exists from being prosecuted twice for the same offence. No-one is suggesting that this Bill declares the law: the ancient ‘double-jeopardy' principle has existed for centuries. When the Bill is enacted, the law will simply change. This article attempts to show, first, that the declaratory theory itself is based on indefensible assumptions of fact. Second, it shows that the theory sometimes leads to bizarre conclusions, which can only be avoided by the most strained reasoning.Finally, it examines why the theory commands so much reverence, when most academics and many judges believe it to be fatally flawed. Why the declaratory theory is factually indefensible The classical exposition of the declaratory theory is that of Lord Esher in Willis v Baddeley (1892): There is, in fact, no such thing as judge-made l aw, for the judges do not make the law, though they frequently have to apply existing law to circumstances as to which it has not previously been authoritatively laid down that such law is applicable.That judges appear to create and change law is undeniable; cases like Donaghue v Stevenson, Hedley Byrne v Heller, and Wednesbury represent significant developments in the law. In Lord Esher's view, the judges in these cases would simply be applying existing principles to new fact situations. But where do these existing principles come from? Some of them, no doubt, come from previous case law. When a judge is called on to decide a case, most often a decision can be made by looking at previous cases whose facts are similar to those at issue, and reasoning from them.Very often there will be previous cases that are binding on a particular court, and these will dictate the outcome. But unless we are to accept an infinite regress of case law, back to the very dawn of time, there must be some point in the past at which an issue was first decided. The romantic view is that the earliest judicial decisions were made by the ‘wandering justices' of the 13th century, who travelled the land at the King's behest, applying and unifying the existing law of the land.The pragmatic view is that the English common law results from an attempt by the Norman French nobility to apply its standards of law in a conquered country, while giving an illusion of continuity. 
Whether the legal developments of the medieval period followed from a process of approving established legal custom, or from the imposition of a foreign jurisprudence, neither possibility answers the question of where the foundational principles come from. There are really only two possibilities: either they were, at some point, created by the judges, or they were based on existing 'universal truths' that were self-evident to the judges. The declaratory theory repudiates the notion that the judges 'made things up', so the only alternative is that they were based on universal truths.

The notion that law is based on fundamental, self-evident principles of ethics is often called 'natural law' jurisprudence. To be fair, the idea of 'natural law' has had a bit of a revival in the last fifty years or so, after being out of favour since the 18th century. The idea that the declaratory theory can be traced back to natural law therefore does not attract the same scepticism today as it would have in the 19th century.

The problem with natural law is that even if one is prepared to accept its basic tenet, that there are indeed self-evident principles of ethics, it is by no means obvious that every situation requiring a judicial decision is one in which such fundamentals are at issue. Consider, for example, the well-known case of Entores v Miles Far East Corp (1955). This concerned the formation of a contract by telex machine, in the very early days of that technology. Previously most formal business transactions would have been carried out by post; the 'postal rule' was – and still is – that if person A offers to contract with person B, then the contract is formed when B's letter of acceptance is posted to A. This is the case even if B's acceptance never reaches A. When considering the use of telex, the court had to decide whether the same principle could be applied to telex as to post; that is, whether a telexed acceptance was effective on sending, or on receipt.

The leading judgement in Entores was given by Denning LJ. In his judgement he does not refer to any existing case law, or any legal principle. Instead, he says that it is simply reasonable and obvious that a telex must be received to be effective. If the declaratory theory is correct, then Denning's judgement cannot be creating law: it must be declaring what the law is. But since he does not refer to any existing law, it must, presumably, be derived from universal principles. Now, a proponent of natural law may believe it is self-evident that, for example, murder and rape are wrong. But it takes a real leap of faith to believe that there are principles of natural law at stake in deciding when a telexed contract is formed.

The reality, of course, is that when Entores was heard, no-one really wanted to see the 'postal rule' extended to a new technology. Denning's judgement is an entirely pragmatic one. It does not require any higher principles to be considered. In summary, the declaratory theory is predicated absolutely on acceptance of a natural law view of jurisprudence, not just for fundamental principles of ethics, but for everything. This, I suggest, is just too much to swallow.

Why the declaratory theory produces bizarre results

Law students generally know about the 'retrospectivity of the declaratory theory', but it does not seem to be well understood that this is not a doctrinal matter, or something that can be argued either way; it is an inevitable conclusion of the declaratory theory.
If a judicial decision cannot create new law, then when the judge declares the law, as a matter of plain logic he is declaring what the law always was. In the Entores example discussed above, this does not create a problem. It established that the use of telex had certain legal consequences, but since telex was only just coming into use when the decision was made, the fact that Denning was declaring what the law had always been is of no consequence. It is purely a matter of academic discussion whether the 'postal rule' would have applied to telex in, say, the 15th century. It is, surely, of no practical consequence.

Perhaps the first occasion on which the full implications of the declaratory theory had to be confronted squarely by a court was the case of Kleinwort Benson v Lincoln City Council. Here, the House of Lords had to rule on what should have been, for a court of this standing, a routine matter. The question at issue was whether money was recoverable in a restitution action if it had been paid from one party to another under a mistaken understanding of the law. It had always been the case that money paid under a misunderstanding of fact was recoverable. It was widely believed that the inability to reclaim money paid under a mistake of law was unjust, and incompatible with other legal principles and other jurisdictions.

Both parties to the case, and all five of the Law Lords, were in agreement on this point: it should be possible to recover money paid under a mistake of law. The disagreement was on whether the decision that it was recoverable should apply only to new cases, or to past cases as well. Kleinwort Benson, a bank, had already paid its money to the defendant local authority. It therefore argued that the decision should operate retrospectively, so it could reclaim its money. The local authority, on the other hand, argued that the decision should not have retrospective effect.

The problem was that if the issue were decided in favour of the claimant bank, the decision must have retrospective effect. This is a direct consequence of the declaratory theory. After all, if the law at time T1 was X, and it is later changed at time T2 by judicial 'declaration' to Y, then the effect of that declaration is to deem that the law at T1 was Y as well. Of course, no-one at time T1 knew this, and so a decision made on the basis that the law was X, not Y, was necessarily mistaken.

You may be wondering why this would have such dramatic consequences. Well, a potentially large number of businesses could suddenly find that they had grounds for litigation arising from things that happened in the distant past, and which they had no way of knowing at the time would be actionable. No-one would wish to see a barrage of ancient, poorly-remembered cases dragged up before the courts in the hope of gain. For technical reasons which I don't have space to explain here, the Limitation Act would not prevent this.

So the Law Lords were faced with a problem. They could decide justly, in favour of the claimant bank, by ruling that it could recover its money, and accept the inevitable problems that the retrospectivity of the decision would bring. Or they could decide against the claimant, and avoid the problems, but at the expense of leaving in place an unjust and criticised rule of law. It was simply not open to the judges to change the unjust law without the change being retrospective, unless they were prepared openly to attack the declaratory theory.
It is interesting to see how the various judges attempted to deal with this problem. It should be noted from the outset that all the Law Lords in Kleinwort Benson agreed that, in practice, judicial decisions do change the law, rather than simply declaring it. No-one suggested for a moment that the declaratory theory was actually true. For example, Lord Goff says: It is universally recognised that judicial development of the common law is inevitable. If it had never taken place, the common law would be the same now as it was in the reign of King Henry II... However, there was very little enthusiasm for making an official pronouncement to that effect. We will discuss possible reasons for this later.

Lord Browne-Wilkinson proposed a judicial damage-limitation exercise. He suggested that although the declaratory theory should be upheld, it could be prevented from giving rise to actions arising out of past conduct: ... retrospection cannot falsify history: if at the date of each payment it was settled law... [the claimants] were not labouring under any mistake of law at that date. The subsequent decision... could not create a mistake where no mistake existed at the time.

In other words, what he seems to be saying is that although the claimants did in fact err in law, they had not made a mistake of law, so they could not reclaim their payments. This is quite a neat trick, because it upholds the revered declaratory theory while preventing it from giving rise to an undesirable situation. However, it does rely on accepting that there are two different kinds of 'mistake of law'. One kind occurs when a person misunderstands the law that actually subsists at the time he applies it, and which continues to subsist. The other kind occurs when a person correctly understands the law at the time he makes the decision, but his understanding is later made wrong by a judicial decision.

Even if one accepts this arbitrary and unfounded distinction, it seems impossible to avoid the conclusion that it is unjust. If a person makes a mistake of law, and the law remains the same, then the mistaken person can reclaim any money paid as a result of that mistake. On the other hand, a person who later finds that he was mistaken as a result of a judicial decision cannot reclaim anything. Yet the latter person is blameless: his decision has been 'wronged' by later events beyond his control. The former person could at least (in theory) have discovered what the law was. The effect of the Browne-Wilkinson solution is to leave the declaratory theory intact, at the expense of justice and common sense.

Lord Goff showed, perhaps, the greatest reverence for the declaratory theory: I can see no good reason why your Lordships' House should take a step which, as I see it, is inconsistent with the declaratory theory of judicial decision as applied in our legal system... As a result, he was prepared to allow a person to recover money paid under an understanding of the law which was correct at the time, and later shown to be false. In his analysis, the claimant was labouring under a mistake of law, but simply did not know it. Lord Goff correctly analysed the effect of the retrospectivity of the declaratory theory, and allowed it to stand despite the odd results it engenders.
Lord Hoffmann recognised the problems that would follow from finding for the claimant, but decided that they were a price worth paying for doing justice in the particular case: This may suggest that your Lordships should leave the whole question... to the legislature... There is obviously a strong argument for doing so, but I do not think that it should prevail over the desirability of giving in this case what your Lordships consider to be a just and principled decision.

Lord Hope decided along much the same lines as Lord Goff. Of the five Law Lords, Lord Lloyd was the only one to criticise the declaratory theory: It follows that... the House of Lords is doing more than develop the law. It is changing the law, as common sense suggests... If this view of what happens is inconsistent with the declaratory theory of the court's function, then it is time we said so. It always was a fairy tale. And: For myself, I would want to allow the appeal, if I could, [avoiding the effect of retrospectivity]. But as that is not to be, I consider the second best course is to leave the abolition of the mistake of law rule to Parliament.

He seems to be saying that a decision for the claimant, coupled with the effect of the declaratory theory, will produce results so bizarre and unpredictable that it ought not to be allowed. In other words, the price of doing justice in this case is too high.

Legal retrospectivity is bad enough in the civil law, but in the criminal law it becomes a human rights issue. Article 7(1) of the European Convention on Human Rights specifically forbids criminal sanctions for an act that did not constitute a crime at the time it was committed. In other words, however heinous we might think an act is, it can't be punished unless the offender had a way to know it was illegal. Of course, 'ignorance of the law is no defence', but the offender has to be able to know the law to be bound by it.

Consider the famous House of Lords case of R v R (1991). This concerned a man who raped his wife, and who based his defence on the fact that for a man to rape his wife was not, in fact, illegal. It may be condemned, it may even be wicked, but it was not – at that time – illegal. If a man had approached a solicitor in 1990 and said 'Look, I'm thinking of raping my wife, is that illegal?', a competent solicitor may well have said: 'Well, of course I wouldn't condone it, but the balance of authority is that it isn't actually illegal.' He could have cited authorities going back to the 16th century to back this up.

At that time, there was increasing pressure on Parliament and the courts to overturn this unedifying principle of law, but when R was heard, no action had been taken. To cut a long story short, the House of Lords decided that marital rape was illegal, reversing a 400-year tradition. Everyone, with the exception of the defendant, heaved a sigh of relief. In 1994, the decision was put on a statutory basis, which appeared to settle the matter once and for all.

The fly in the ointment is our old friend retrospectivity. The decision in R was not that marital rape was illegal, but that it had always been illegal. Again, the court had no power to decide otherwise. And this means that an octogenarian who raped his wife in the 1940s could now be prosecuted. You may feel that this is a just conclusion; you may feel that rapists should get their just deserts.
However, the fact remains that we would be punishing a person for something which was not illegal at the time, and which he would have had no way of knowing was ever going to be illegal. The social conditions of the time may not even have led our hypothetical defendant to think he was doing anything wrong. But he could still be prosecuted.

This may sound far-fetched, but in fact within a year of the decision in R, cases were being heard in the European Court of Human Rights (ECHR). SW v United Kingdom (1995) concerned a man who was prosecuted in 1994 for a rape he had allegedly committed in 1990. It was far from obvious that marital rape was illegal in 1990. The ECHR upheld the criminal conviction, on the basis that when the rapes occurred, the defendants could reasonably have foreseen that the criminalisation of marital rape was likely. The problem with the decision in SW v UK is that it suggests that a person must govern his behaviour not by what the law is, but by what he predicts it will be when any consequent prosecution is brought. So not only is ignorance of the law no defence, but ignorance of the future development of the law is also no defence! None of the foregoing is intended to condone the practice of marital rape. Judicial retrospectivity presents the same kind of problem for any criminal offence, of any severity.

Lord Diplock has suggested that the retrospectivity of judicial decisions discourages judges from correcting defects in the law. Judges have to be very conservative if they must predict not only the effect of their decisions on new cases, but the effect they would have had if made in the past. To get around this problem, the Supreme Court of the USA has adopted the device of 'prospective overruling'; this device allows the court to state that a decision that changes the law is not to have retrospective effect. The problem is that prospective overruling is simply incompatible with the declaratory theory: if the former comes in, the latter must go. However, as Prof. Zander says, the courts can accept that the declaratory, retrospective effect of their decisions is doctrinally 'correct', while at the same time letting it be known that they will decide cases on the basis of the law as it would have been understood when the events occurred, not when the case is heard. This is a fudge, but probably a workable fudge.

Why is the declaratory theory so revered?

In Albion's Fatal Tree (1975), Douglas Hay argues that the decline in formal religious observance in the 18th century left a power vacuum to be filled by the law. For law to command the respect of society in the way that the church had done, it was necessary that it be seen as something above and beyond its practitioners: The punctilious attention to forms, the dispassionate and legalistic exchanges between counsel and the judge, argued that those administering the laws submitted to its rules... In short, its very inefficiency, its absurd formalism, was part of its strength as ideology.

Such an ideology would be undermined, of course, if it were seen that law were nothing more than the creation of ordinary people. It was the job of the legal profession to form an elite, and thereby shield the ugly reality of lawmaking from public scrutiny. While this argument may have had validity in the 18th century, it is not at all easy to see that it stands up in the 21st century. To respect the law, we don't necessarily need to view it as having supernatural origins.
Moreover, since the 18th century the development of the law has increasingly been effected by statute. No-one expects Parliament's legislative programme to be guided by anything more than the views of society as expressed through the ballot box. Nevertheless, while most judges tacitly accept that their activities have the effect of lawmaking, relatively few have been prepared to criticise the declaratory theory in public. Lord Reid is usually credited with first describing the declaratory theory as a 'fairy tale'; in a 1972 article, 'The judge as law-maker' in JSPTL, he described the 'Aladdin's cave' in which 'those with a taste for fairy tales' expect the common law to be found. However, he was not the first influential judge to cast doubt on the declaratory theory. For example, Lord Radcliffe wrote in the Law Society Gazette in 1964: ... there was never a more sterile controversy than that upon the question whether a judge makes law. Of course he does. How can he help it? Such comments are, to say the least, unusual.

Prof. Atiyah is probably the most outspoken critic of the modern judicial attitude to the declaratory theory. In Judges and Policy ([1980] ILR 346) he identified five reasons for its continued existence.

First, it is to the advantage of the judge if he can, in a difficult case, deflect any criticism of his own decision onto 'the law' as a higher principle. As Atiyah says, of course, this can be seen as a 'shabby attempt to evade responsibility'. Nonetheless, the job of a judge is difficult enough without having to deal with personal attacks on his decisions. Lord Devlin has suggested that judges will occasionally hint to claimants that they wish they could find otherwise, but are bound by 'the law'.

Second, it is generally accepted as a constitutional principle that it is the role of the legislature to make law, and the role of the judiciary to interpret it in specific cases. Where judges do make law, they should do so within narrow constraints. There is undoubtedly some virtue in this principle. The most famous exponent of judicial creativity in modern times is almost certainly Lord Denning. His view was very much that it was the job of the judge to 'do justice'; if that meant that principles of law had to be bent to fit, that was a price worth paying. The problem is that his decisions do not generalise. It is often difficult for later judges, reading his reasoning, to determine whether the decisions he made are based on law that ought to be applicable in other cases, or on fact situations particular to the case under consideration. This is evidenced by the fact that many of the principles that he established by doing the right thing in a particular case have come to be misapplied in later cases, and have had to be circumscribed by later judges. For example, his decision in Solle v Butcher (1949), that a contract could be set aside on 'equitable grounds' when entered under a mutual mistake, did justice in the case itself. This decision was followed in a large number of cases, but it was never entirely clear what would amount to 'equitable grounds'. Finally, in 2003 the case of The Great Peace more or less demolished the entire concept of 'mistake in equity' and put this branch of law back where it was 50 years ago. Even if judicial creativity can do justice in the present case without compromising later decisions, there are other reasons why judicial creativity should be constrained.
Judges are only able to deal with the cases they hear; it is difficult for them to take a wider view of any issue. Judges are not well placed to make decisions that involve elements of social policy. In addition, judges are arguably drawn from a much narrower section of society than MPs, and are therefore less representative.

Third, Atiyah argues that judicial lawmaking is tolerated only because it is not exercised openly. Lord Devlin has argued (Judges and lawmakers [1976] 39 MLR 11) that if the courts are given, or arrogate to themselves, the power to make decisions without retrospective effect (and thereby demolish the declaratory theory), this will amount to an approval to engage in judicial lawmaking in the large. While we accept that development of the law requires an occasional exercise of judicial creativity, the fact that it has to be done on the sly means that it won't be done all that often: Paddling across the Rubicon by individuals in disguise... is better than the bridging of the river by an army in uniform with bands playing.

Atiyah's fourth argument is that many judges themselves have a naive and simplistic view of their own lawmaking role. They frequently speak or write as though the only alternative to a slavish devotion to the declaratory theory is the wholesale abandonment of the doctrine of precedent and the separation of powers. Judges frequently invoke Selden's old chestnut about the law varying with the length of the Lord Chancellor's foot as a reason for their own conservatism. However, there is no reason to assume that a disavowal of the declaratory theory need signal the end of the doctrine of precedent (it has not done so in the USA), or the dissolution of the separation of powers.

The fifth argument is that public respect for the judiciary depends on their strict and evident impartiality. If the judge were seen to create or change law, the implication is that the judge prefers one view of the law to another. But, as Atiyah says, there is no reason to believe that the public will respect a judge who is impartial but unjust more than one who is partial but fair.

Judicial adherence, at least in public, to the declaratory theory may be for the very best of motives. However, in a well-educated, democratic society, it is doubtful whether it is ever appropriate for the governing classes to espouse one point of view in public and a different one in private. Not only is it intellectually dishonest, it is doubtful whether it is necessary. Moreover, it is a strategy that is unlikely to work for much longer. It seems unlikely that the public will be moved to increased confidence in the judiciary when it becomes obvious that the judiciary have practised a paternalistic and patronising form of misinformation for all these years.

Tuesday, January 7, 2020

Should Tablets Become The New Primary Way Students Learn

With each passing year, school systems strive to become the top schools in their districts by developing new ways to further educate young minds and improve overall test scores. To achieve their goals, some schools have cut down on recess and increased classroom instruction. Other schools have simply removed basic electives such as home economics and workshop and replaced them with more math and science classes. Then there are schools where the newest technology, such as tablets, is treated as the next big thing in teaching children. Although finding better ways to teach our children is always worth pursuing, bringing tablets into the classroom may help children learn to use technology, but in reality it will decrease their chances of learning to their full potential. Tablets should not be allowed to become the primary way students learn.

Learning how to use the newest technology is a useful way of adapting to where the current century is heading. Technology is growing rapidly and is used far more frequently than ever before. It has improved the way we do things: communicating with each other and searching for information have become much more accessible online, and even transportation has evolved through the internet, with innovations such as smart cars.