Please read attached file for description - thank you.

CP3401/CP5636 Assessment Item 3: Emerging Technologies (Report: 30%)
DUE: Week 13, Sunday 11:59pm

In this assessment, you will learn about emerging technologies and identify innovation opportunities and ethical issues when using these technologies. You are required to investigate one of the following topics:

1) the top strategic technology trends from the Gartner Top Strategic Technology Trends for 2022 (https://www.gartner.com/en/information-technology/insights/top-technology-trends), or
2) the use of AI for decision making from Would You Let Artificial Intelligence Make Your Pay Decisions? (https://www.gartner.com/smarterwithgartner/would-you-let-artificial-intelligence-make-your-pay-decisions),

and write a substantive report on the emerging technology of your choice (around 15 pages in length and 3,000-5,000 words without references). A typical structure of the report would look like:

- Title (typically around 5-15 words)
- Description of the technology (typically around 1,000 words)
  o What it is
  o History
  o Current development state
  o etc.
- SWOT analysis of the technology (typically around 1,500 words)
- Current uses and applications, including innovation and ethical issues (typically around 1,000 words)
- Current maturity and future predictions (typically around 500 words)
- References (IEEE style; see https://libguides.jcu.edu.au/IEEE)

ASSESSMENT TASK 3: CRITERIA SHEET
Report (30%)
• Description of the technology: 5%
• SWOT analysis: 5%
• Coverage (depth and breadth) of current uses and applications (ethical issues need to be discussed): 5%
• Current maturity and future predictions: 5%
• Structure & organisation: 5%
• References (using IEEE referencing style): 5%

Rubric (bands: Exemplary 9-10, Good 7-8, Satisfactory 5-6, Limited 2-4, Poor 0-1; for every criterion, Good exhibits aspects of both Exemplary and Satisfactory, and Limited exhibits aspects of both Satisfactory and Poor):

- Description of the technology. Exemplary: outstanding level of description detail, including definition, history and current state. Satisfactory: reasonable level of description detail, including definition, history and current state. Poor: simple level of description detail, including definition, history and current state.
- SWOT analysis. Exemplary: outstanding SWOT analysis clearly identifying all strengths, weaknesses, threats and opportunities. Satisfactory: reasonable SWOT analysis identifying most strengths, weaknesses, threats and opportunities. Poor: simple SWOT analysis identifying few strengths, weaknesses, threats and opportunities.
- Coverage of current uses and applications (including ethical issues). Exemplary: outstanding coverage identifying extensive breadth and depth of uses and applications. Satisfactory: reasonable coverage identifying breadth and depth of uses and applications. Poor: limited coverage of uses and applications.
- Current maturity and future predictions. Exemplary: outstanding explanation and outstanding future predictions based on sound justifications. Satisfactory: reasonable explanation and reasonable future predictions based on acceptable justifications. Poor: limited explanation and limited future predictions with missing or unreasonable justifications.
- Structure and organisation. Exemplary: outstanding structure covering all required sections and necessary information. Satisfactory: reasonable structure covering most required sections and necessary information. Poor: illogical or incoherent structure.
- References. Exemplary: outstanding referencing using IEEE style without any errors. Satisfactory: reasonable referencing using IEEE style with few errors. Poor: referencing using different referencing styles or with many errors.
Answered 3 days after Sep 28, 2022 (James Cook University)

Answer To: CP3401/CP5636 Assessment Item 3 Emerging Technologies (Report: 30%)

Ayan answered on Oct 01 2022
Table of contents
Description
History
Current development state
SWOT analysis
Current uses and applications
Current maturity and future predictions
References
In any organisation or institution, decisions cannot be based on past performance alone, especially not in an era of abundant data and intense competition. Every business strives to make the decisions that will increase its income, and thanks to technological advances, organisations now use a range of alternative approaches to aid their decision-making [1]. Different industries use different strategies to their advantage. This report explores the use of analytics, artificial intelligence and data mining technologies in decision-making. Making decisions is an inherently human behaviour with potential consequences.
It is therefore not surprising that researchers have attempted to improve and extend human capabilities through the development of computer technology. Thanks to advances in artificial intelligence (AI), this goal has been achieved in several applications. Intelligent decision support systems (IDSS), also known as AI-integrated decision support systems, are increasingly used to support decision-making in a variety of fields, including finance, healthcare, marketing, commerce, command and control, and cybersecurity [2]. Systems that in some way imitate human cognitive skills are said to be intelligent: they use AI tools to reason, learn, review, plan and analyse. AI tools can extend human capabilities by, for example, surveying and selecting relevant data from very large and distributed data sources, applying analytical tools to unstructured data, generating generalised solutions from rule-sets and probabilities, and identifying associations in data from different sources that might influence a decision. When used in conjunction with decision support systems, tools such as artificial neural networks, fuzzy logic, intelligent agents, agent teams, case-based reasoning, evolutionary computing and probabilistic reasoning can aid the assessment. These systems are extremely useful for solving difficult problems that are unpredictable, involve uncertainty and carry large amounts of data.
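To make the idea of rule-sets and probabilities in an IDSS concrete, here is a minimal, purely illustrative sketch in Python. The field names, rules, weights and threshold are all assumptions invented for this example, not a real lending system; a production IDSS would learn such weights from data rather than hard-code them.

```python
# Illustrative sketch of a rule-based intelligent decision support system (IDSS)
# that scores a loan application by combining simple rules with a probability.
# All field names, weights and thresholds below are hypothetical assumptions.

def score_application(applicant: dict) -> float:
    """Return an approval probability in [0, 1] from weighted rules."""
    score = 0.5  # neutral prior before any evidence is applied
    if applicant.get("income", 0) > 50_000:
        score += 0.2          # rule: higher income raises approval odds
    if applicant.get("defaults", 0) > 0:
        score -= 0.3          # rule: past defaults lower approval odds
    if applicant.get("years_employed", 0) >= 2:
        score += 0.1          # rule: stable employment helps
    return max(0.0, min(1.0, score))  # clamp to a valid probability

def decide(applicant: dict, threshold: float = 0.6) -> str:
    """Map the score to a recommendation a human decision-maker can review."""
    p = score_application(applicant)
    return "approve" if p >= threshold else "refer to human analyst"
```

Note that the system recommends rather than decides: low-confidence cases are referred to a human analyst, reflecting the decision-support (rather than decision-replacement) role described above.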
For researchers, the term "artificial intelligence" and its related technologies are not new. Contrary to what you might expect, the idea is considerably older: even Greek and Egyptian tales contain stories about mechanical men [3]. The following list of landmarks in AI history illustrates the progression from its inception to the present.
· Year 1943: Warren McCulloch and Walter Pitts produced the first work that is today recognised as AI. They proposed a model of artificial neurons.
· Year 1949: Donald Hebb developed a rule for updating the strength of connections between neurons. Hebbian learning is the modern name for his rule.
· Year 1950: The English mathematician Alan Turing published "Computing Machinery and Intelligence", in which he outlined a test. The Turing test can be used to determine whether a computer is capable of behaving as intelligently as a human.
· Year 1955: Allen Newell and Herbert A. Simon created the "first artificial intelligence program", Logic Theorist. The program proved 38 of 52 mathematical theorems and found new, more elegant proofs for some of them.
· Year 1956: John McCarthy, an American computer scientist, coined the term "artificial intelligence" at the Dartmouth Conference. AI was first recognised as a legitimate academic discipline.
· Year 1966: Researchers placed a strong emphasis on creating algorithms that could solve mathematical problems. In 1966, Joseph Weizenbaum developed ELIZA, the first chatbot.
· Year 1972: The first intelligent humanoid robot, known as WABOT-1, was built in Japan.
· Years 1974-1980: The first AI winter. The term "AI winter" describes a period when computer scientists faced a severe shortage of government funding for AI research [4].
· Year 1980: After the hiatus, AI returned with the "expert system": programs designed to make decisions like a human expert.
· Year 1997: IBM's Deep Blue defeated reigning world chess champion Garry Kasparov, becoming the first machine to beat a world chess champion.
· Year 2002: The Roomba vacuum cleaner marked the debut of artificial intelligence (AI) in the household.
· Year 2006: AI entered the business world. Companies such as Facebook, Twitter and Netflix began using AI.
· Year 2011: IBM's Watson won Jeopardy!, a game show in which contestants must solve complex questions and riddles. Watson demonstrated that it could understand natural language and quickly answer difficult questions.
· Year 2012: Google introduced "Google Now", an Android feature that could predict information a user might need.
· Year 2014: The chatbot "Eugene Goostman" controversially won a "Turing test" competition.
· Year 2018: IBM's "Project Debater" held its own in a debate with two expert debaters on a range of challenging topics.
Current development state
The results of the 2021 survey show that AI adoption is still increasing steadily: 56% of all respondents report AI use in at least one function, up from 50% in 2020. According to the most recent findings, businesses headquartered in emerging economies, such as China, the Middle East and North Africa, have adopted AI more frequently than last year: 57% of respondents report adoption, up from 45% in 2020. Across all geographies, Indian businesses have the highest adoption rates, closely followed by those in Asia-Pacific [5].

As in the previous two polls, the business functions where AI adoption is most pervasive are service operations, product and service development, and marketing and sales, but the most common use cases span a range of activities. The largest percentage-point growth in AI use has been in marketing-budget allocation and spending effectiveness, with service-operations optimisation, AI-based product enhancement and contact-centre automation making up the top three use cases.

The results also appear to highlight a growing impact of AI on financial outcomes. The share of respondents attributing at least 5% of earnings before interest and taxes (EBIT) to AI has grown year over year, to 27% from 22% in the previous survey. At the same time, costs have risen while revenue benefits from AI have stayed flat or even dropped since the last survey, especially for supply-chain executives, where AI was unlikely to compensate for the global supply-chain disruptions of the pandemic era. The biggest year-over-year changes were in the shares reporting cost reductions from AI in product and service development, marketing and sales, and strategy and corporate finance, with respondents reporting noticeably greater cost savings from AI than they did previously in each function [6].
The prospects for AI, according to respondents, remain bright. Similar to the findings of the 2020 survey, nearly two-thirds of respondents anticipate that their firms' investment in AI will grow over the next three years.
According to one report [7], 82% of Americans use digital payments, and online purchases of goods and services from US merchants have totalled $871.103 billion. How quickly the BFSI sector adapts to this shift in customer behaviour and uses IoT, blockchain, machine learning (ML), artificial intelligence (AI) and other state-of-the-art technologies to improve and adjust its business models will determine how successful the digital economy is. AI has transformed bill paying and online purchasing, making both processes more efficient. According to the 2019 Global FinTech Adoption Index, 25% of small and medium-sized enterprises use AI technology for banking, financing and financial administration. AI-driven digital banking improves the customer experience by speeding up service and providing 24/7 access to customer support [7]. Thanks to intelligent automation and natural language processing, customers can speak with a chatbot or virtual assistant instead of waiting to talk to a real person. Computer vision is another aspect of AI beginning to be used in payment processing to replace cash and credit cards: customers use mobile phones to make transactions, and transaction information is maintained online. Furthermore, financial institutions are starting to use computer vision to enable clients to open accounts online rather than in a traditional...
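The natural-language routing a banking chatbot performs before handing a customer to a human agent can be sketched, in its very simplest form, as keyword-based intent matching. The intent names and keyword sets below are illustrative assumptions, not any real bank's taxonomy; production systems use trained language models rather than word overlap.

```python
# Illustrative sketch: keyword-overlap intent matching for a banking chatbot.
# Intent names and keyword sets are hypothetical examples.

INTENTS = {
    "check_balance": {"balance", "funds", "account"},
    "make_payment":  {"pay", "payment", "transfer", "bill"},
    "report_fraud":  {"fraud", "stolen", "unauthorized", "suspicious"},
}

def route(message: str) -> str:
    """Pick the intent whose keyword set overlaps the message the most."""
    words = set(message.lower().split())
    best, overlap = "handoff_to_human", 0   # default: escalate to a person
    for intent, keywords in INTENTS.items():
        hits = len(words & keywords)        # count shared keywords
        if hits > overlap:
            best, overlap = intent, hits
    return best
```

As with the decision-support example, messages the system cannot confidently classify fall through to a human agent, which is how deployed banking chatbots typically balance automation against customer frustration.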
