
Ethical and Legal Issues Related to Robotic Technology

There are many ethical and legal issues surrounding robotics. Is it safe? What are the risks? Is it reliable? These are just some of the questions being asked about robotics. Robots today serve many purposes, in medicine, the military, the environment, and other areas too numerous to mention. Many people have ethical concerns about the use of robots in daily life, while others argue that robots are beneficial to our way of living. We now live in a world surrounded by technology; we use it from the first thing we do in the morning to the last thing we do at night. Today, a major issue we face concerns military robotics. The military has always been a leader in technological research and development, and its rules for robotics differ somewhat from those of the civilian world. In the civilian world, robots are constantly tested to ensure that they do not go haywire and harm people; in the military world, harming people is precisely what some robots are programmed to do. This creates an ethical divide between how robots are used in the military world and how they are used in the civilian world.

Aside from the military, robots today raise questions about whether they should babysit our children or keep the elderly company instead of humans. Since the Industrial Revolution, job displacement and economic impact have been major concerns with any new technology. In the medical world, surgical robots in particular raise troubling questions about liability and responsibility. Robots carry the same risks as computers: if an error occurs while a surgical robot is performing an operation, the problem cannot be easily fixed, and some fear that humans will lose the skill of performing surgery themselves. And given continuing angst about privacy, robots present the same risk that computers do (that is, "traitorware" that captures and transmits user information and location without our knowledge or consent), if not a greater risk, since we may be more trusting of an anthropomorphized robot than of a laptop computer (Myers, 2010).

Modern medical concerns with telemedicine and robotics practiced across national or other jurisdictional boundaries engage the historical, complex area of law called conflict of laws. An initial concern is whether a practitioner licensed only in jurisdiction A who treats a patient in jurisdiction B violates B's laws. Further concerns are whether a practitioner in A who violates a contract or treats a patient in B negligently incurs liability in B, A, or both, and, if treatment lawful in A is unlawful in B, whether the practitioner commits a crime. Judicial procedures are set by courts in which proceedings are initiated, but courts may decline jurisdiction due to inconvenience to parties. If courts accept jurisdiction, they may apply their own substantive legal rules, but may find that the rules of a conflicting jurisdiction should apply. Cross-border care should not change usual medical ethics, for instance on confidentiality, but may mitigate or aggravate migration of specialists. (Dickens & Cook, 2006).

Medical Ethics:
Robotics serves the ethical principle, or duty, of beneficence, in that it widens the capacity of practitioners in many medical disciplines to make their services available in areas they cannot feasibly reach in person. These technologies can thereby mitigate the shortage of medical specialists in underserved regions and countries. Against this benefit, however, is the risk that these technologies may aggravate the migration of medical specialists from low-resource areas, by giving them the means to serve, through electronic and robotic technologies, the countries or areas they leave (Bramlet, 2005). The movement of physicians from poor to rich countries appears to be a growing obstacle to global health, increasing inequities as countries such as the US, UK, Canada, and Australia recruit medical graduates and specialists from lower-income countries. Regardless of the system under which a physician operates, the principles of medical ethics globally binding on the medical profession must never be compromised (Robotic Technology Systems, 2007). These include such matters as ensuring confidentiality, verifying the reliability of equipment, offering opinions only when possessing the necessary information, and keeping contemporaneous records.

Confidentiality is at risk from electronic eavesdropping, and special care is needed to prevent inadvertent copying of communications such as diagnoses. Care must also be taken to ensure that non-physician intermediaries who collect and transmit data about patients, such as medical technicians, observe confidentiality. Some disclosures are of special sensitivity, such as transmissions of ultrasound scans that show fetal sex or abnormality, and diagnoses of, for instance, cervical cancer. Where patients are treated or monitored in their homes, family members may become, and perhaps need to become, involved in their care; practitioners should ensure, as far as they can, that local personnel have informed patients of this and obtained their agreement. Ethics and law coincide in encouraging cross-border recognition of medical licensure and specialist accreditation, and harmonization of standards of professional practice, so that jurisdictional differences will not obstruct patients' access to robotic surgical services. Similarly, commercial insurance and professional self-defense association arrangements should be adjusted to the realities of medical globalization, and should not obstruct the upgrading of care that technology can afford patients in medically deprived settings. The spreading of benefits depends in part on the spreading of risk. Furthermore, courts addressing conflict-of-laws issues should take care to avoid dysfunctional decisions that would deter practitioners from providing cross-border services in unfamiliar legal territory (Hogan, 2000).

Military Ethics:
Robotics has been prevalent within the military for quite some time. Drones are an example of this: they enable engaging selected targets without the presence of a human being. Unmanned aerial vehicles have been part of both the United States Air Force and Navy since the early 1900s. Since then there have been many attempts to incorporate robotics into the services, and fast-forward to today, robotics plays a critical role in the service. Convenience is the main purpose of this collaboration; however, that purpose could become a disadvantage. Patrick Lin, a research director for Nanoethics, addressed this issue in an article titled "Military 2.0: Ethical Blowback From Emerging Technologies". He states that a robot in the military provides an illusion of convenience that may lead to an array of other issues. One issue is that it may make it more comfortable to engage in war rather than deter that decision. What happens when both parties use robotics to engage in war? What parameters will be created to draw the line or to maintain boundaries? What would be considered ethical when using machinery to engage in this activity? These decisions become very complex when they pertain to activities that could endanger others' lives from a military standpoint.

Rules of engagement are another issue that must be considered. How will a robot know when it is right to engage a particular enemy, such as a six-year-old holding a non-lethal object? As Peter Warren Singer, an American political scientist, put it, "We must create a system of rules of engagement for the robots." Utilizing robots as a form of defense also echoes an older ethical argument from the time when gunpowder was introduced. Singer notes that gunpowder was frowned upon as a cowardly technology that should be outlawed, because it did not require fighting face to face; it was seen as an unfair way to fight. New technologies, especially robotics, raise challenges to existing understandings.

Robotics and the Environment:

Robotics is becoming more advanced every day. We now use robots in almost everything: factories, medicine, vehicles, toys, and so on. Robots are constantly used to accomplish tasks that would be impossible or very hard for human beings to do. Sometimes we face issues that affect our environment, and thus all of us directly, and sometimes those issues are too hazardous for humans to resolve; this is when we resort to robotics. For example, during the BP oil spill it was impossible for humans to dive to such depths to fix the problem, so remote-controlled robots were sent to seal the leak. When nuclear power plants have melted down, we have resorted to robots to make the repairs, because the radiation level would have killed any human.

There is no doubt that robots can be very beneficial to our environment, and that humans are little by little becoming more dependent on robots. But what many people do not think about is the fuel or energy needed to make these robots work, and that this energy will run out. What will happen when all these robots run out of fuel or energy? We will have piles and piles of broken, rusty, dead robots. All those defective or broken machines will become increasingly harmful to our environment: the scrap metal will become harder to recycle and will start rusting, not to mention the harmful fossil fuels burned and the discarded batteries that damage the environment.
