
Sample exam format and questions

Part 1: Short Answer (30 points, 10 points/question)

Part 1 presents you with five questions, and you must select three to answer in 3-5 sentences each. Your answers will be graded on your ability to demonstrate your understanding of basic course concepts from lecture and readings, and to answer all parts of each question.

Please do not submit answers to more than three questions.

Three sample questions (note that some of these questions refer to concepts we have not covered in class this semester; they are meant to give you an idea of the type of questions you can expect to encounter on the exam):

• How is Facial Recognition Technology a form of "physiognomic AI," and what does this framing draw attention to? In your answer, please describe any relevant historical background to this claim and indicate its ethical relevance in the present.
• What are the CARE principles, in what context did they originate, and what new concerns do they bring into responsible work with data?
• Give an example of research in the datafied world discussed in this class where the formal expectation of "informed consent" was satisfied, but also not entirely adequate for understanding the situation or preventing harm. (Note: Do not choose the Tuskegee syphilis experiment.)

Part 2: Analysis (40 points, 20 points/question)

Please read the selection of text pasted below, respond to the required question in Part 2A, and choose one of two questions to answer from Part 2B.

Please answer each question in Part 2 with a paragraph of at least 5 sentences. You can write more than that, but please be as concise and direct as possible in your answer. Whenever you can, reference the text of the provided article to support your answers. The material presented in the selected text from the article pasted below should be enough of an introduction to the issue for you to answer the questions.
You do not need to read the entirety of the original article (linked), nor do you need to do any more research on the topic. You will be graded on your ability to use material learned in class to answer the questions thoughtfully and clearly. Although the exam is open book/open note, you may not collaborate or discuss the exam with anyone else.

Part 2 Questions (please note that you'll need to scroll down and read the text pasted below in order to understand the following questions)

2A (required question)

Imagine that the same program was being developed at UC Berkeley. What argument would you address to Campus Technology Officer (CTO) Bill Allison to express your position about Berkeley's potential use of the program? Address both questions of privacy and governance.

1. In what ways is privacy at stake in this example? Which forms or aspects of privacy? And what are the limitations of "privacy" as a concept for assessing the ethical stakes of this program? That is, what are some of the major ethical concerns involved in this technology that cannot be captured by the concept of privacy?

AND

2. What processes should a campus like UC Berkeley develop and enact if it wishes to consider responsibly implementing a program like this? Explain your answer using concepts and tools from the course.

2B (choose one of two options)

1. What ideas and conditions likely motivated the development of this program at George Washington University? What narratives, imaginaries, and expectations were involved on the part of campus administration? Who is represented and served by those narratives, imaginaries, and expectations?
2. How should the university assess the service vendor (in this case, Degree Analytics) and analyze its business model? What does Degree Analytics aim to gain from the partnership? (If you choose to answer this question, feel free to look over the vendor's website.)
In answering Part 2B, use at least two tools from the HCE toolkit. If you so choose (this is not required), you can also use another historical example or recent case study to illustrate your argument.

Part 2 Article:

Please read the following Inside Higher Ed article before answering the questions in Part 2 of the exam. You can find the original article here.

Revelations that George Washington University launched a data-analytics pilot project last fall that monitored locations of students, faculty and staff without their knowledge or consent have raised new questions about data privacy on college campuses and shined a light on a project that deeply concerned many GW faculty members.

GW president Mark S. Wrighton apologized for the incident in a campuswide email sent Feb. 11. He emphasized that the university did not analyze individualized data and said all data collected as part of the project would be destroyed. Wrighton said the project was meant to test how data analytics could help GW officials assess building density and use.

Wrighton said he learned about the data collection project shortly after he started as president on Jan. 1. The project was spearheaded by the university's IT, student affairs and safety and facilities divisions and collected data from Wi-Fi networks across GW's campuses, Wrighton said. It was designed to help administrators better understand "density and use of buildings by students, faculty, and staff in the aggregate," Wrighton wrote. A George Washington University spokeswoman said via email that administrators sought the information to "inform operational priorities during the pandemic." The pilot program was first reported by The Washington Post.

Isha Trivedi, a junior and the reporter who covered the story for The GW Hatchet, the campus newspaper, said students have told her they were surprised that GW collected data on students and felt that Wrighton's email was unclear.
She said many students also felt that Wrighton's email lacked necessary context on the firm GW retained to collect the data and the company's past work on campuses, particularly with helping institutions track students' class attendance. "I don't think we know really to what extent this data was being used and what that means for students," Trivedi said. "I've seen people online saying this is awful and shouldn't be happening."

Rory Mir, a grassroots advocacy organizer at the Electronic Frontier Foundation, said the ongoing pandemic has led more campuses to experiment with data analytics to track student and faculty locations. "A lot of companies are pitching the schools, 'Hey, we can track the locations of students for COVID safety purposes,' and most of those claims are kind of nonsense, because you can't really do contact tracing with Wi-Fi, which is what a lot of them are claiming," Mir said.

Mir said data analytics are being used to track student behaviors in ways that are disturbing and which introduce potential bias. Students who are working off campus to put themselves through school might clock less time in the library or miss class more often, Mir said, making institutional reliance on metrics for how often they are present unfair. Universities are "using this big data to track behavioral issues of students … like how often they're going to the library and how much time they spend on campus and trying to associate that with how well they're performing in class," Mir said. "It's a huge privacy invasion."

Cristian Ponce, a freshman majoring in computer science at the California Institute of Technology who has been active in grassroots data-privacy organizing, said he urges fellow students to be skeptical about data collection.
He called the practice of data tracking for student class attendance "invasive." "It just sets up a structure that is far too controlled, where administrators have these details and students aren't taking agency over their own lives," Ponce said.

GW officials have emphasized that data collected during the pilot were "de-identified," meaning identifying information was removed from it and the individual data were aggregated, or combined as a group of data. They acknowledged, however, that the campus IT department attached "descriptors" to the data, so they were not completely anonymized. "I want to be clear that even though the technical capacity may exist to track individuals across our campus, such a capacity was not utilized nor contemplated in this pilot and no individualized data tracking or movement across our campus was ever shared," Wrighton wrote. "Regrettably, however, the university neglected to inform members of our community in advance of commencing this analytical project."

Mir said that even when de-identified data are aggregated, it is still possible to identify specific individuals in the data set. "With enough information, these systems can re-identify individuals just given the granularity of what has been collected," Mir said. "And the more precise this data gets, the bigger the risk of de-anonymizing the people in the data set."

Mir said it is hard to know exactly how rampant this type of data collection is on college campuses, but that institutions are increasingly using Wi-Fi, key cards and other simple systems to track students, even if they are not relying on a system as sophisticated as what GW piloted.

Aaron Benz, founder and CEO of Degree Analytics, which partnered with GW on the data-collection pilot, said institutions usually work with his company to enhance student success initiatives or to understand building density and use.
Benz declined to comment on the GW situation, saying he does not discuss the work he does with specific clients. The GW student newspaper the Hatchet first reported the involvement of Degree Analytics in the campus data-collection effort.

Benz said that since its founding in 2018, Degree Analytics has worked with about 25 colleges, about half of which have used the technology for more individualized data-collection purposes, such as tracking whether students are attending class. "Most professors don't take attendance, yet it's the No. 1 predictor of persistence and success," Benz said. "If a student stops going to class, that's the earliest indicator that they may drop out or fail out."

Degree Analytics offers a product it calls EnGauge Student, which its website says allows institutions to gather "student behavioral metrics," which will allow them to "analyze more student behaviors that better align with student success." Among

Dr Raghunandan G answered on Oct 10, 2022.
ANSWERS
PART 1
QUESTION 1
1. A socio-technical system (STS) takes into account hardware, software, people, and the community as a whole; it brings groups of people and technology together. Large technology companies and platforms are some of the best-known examples of STSs, including tech giants such as Microsoft and Apple, along with social networking organizations such as Facebook, Instagram, and LinkedIn. It is hard to separate platforms like Facebook and Twitter from the communities and technologies that make them possible. As the number of organizations that combine technology and people has grown, sociotechnical systems theory has made way for more holistic approaches. As an example of an STS, consider email providers: Gmail, Hotmail, Outlook, and other email systems each differ, but in every case people, software, and infrastructure work together as an STS to make a functioning email service.
2. Artificial intelligence that adheres to ethical standards ensures that people are treated with dignity and that no one is harmed. In the same manner that it amplifies positive ideas and practices, AI also amplifies destructive ones: it can help eliminate bias and discrimination from human resource practices, for instance, but it can also propagate and amplify bias. Key considerations when assessing the ethics of an AI system include data privacy, permission to use information, security and transparency, and computational justice and bias. Among the main concerns:
· Lack of visibility of AI tools: humans do not always understand how AI decisions are made.
· AI is not neutral: AI-based choices are prone to inaccuracy, discriminatory consequences, and embedded or introduced prejudice.
· Monitoring techniques and data collection that implicate individual privacy.
· New worries about fairness and threats to human rights and other fundamental values.
3. The Belmont Report sets out basic ethical principles for research involving human subjects. In addition, it establishes rules to ensure that these...