[Free] 2019(Nov) EnsurePass Microsoft AI-100 Dumps with VCE and PDF 1-10

Get Full Version of the Exam
http://www.EnsurePass.com/AI-100.html

Question No.1

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You have Azure IoT Edge devices that generate streaming data.

On the devices, you need to detect anomalies in the data by using Azure Machine Learning models. Once an anomaly is detected, the devices must add information about the anomaly to the Azure IoT Hub stream.

Solution: You deploy Azure Functions as an IoT Edge module.

Does this meet the goal?

  1. Yes

  2. No

Correct Answer: B

Explanation:

Instead, use Azure Stream Analytics together with a REST API call to an Azure Machine Learning endpoint.

Note:

Available in both the cloud and Azure IoT Edge, Azure Stream Analytics offers built-in machine learning-based anomaly detection capabilities that can be used to monitor the two most commonly occurring anomalies: temporary and persistent.

Stream Analytics also supports user-defined functions that call out to Azure Machine Learning endpoints via a REST API.

References:

https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-machine-learning-anomaly-detection

Question No.2

DRAG DROP

You are designing an AI solution that will use IoT devices to gather data from conference attendees, and then later analyze the data. The IoT devices will connect to an Azure IoT hub.

You need to design a solution to anonymize the data before the data is sent to the IoT hub.

Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.

[Image: list of actions and answer area not available]

Correct Answer:

[Image: correct answer not available]

Question No.3

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You are deploying an Azure Machine Learning model to an Azure Kubernetes Service (AKS) container.

You need to monitor the accuracy of each run of the model.

Solution: You configure Azure Monitor for containers.

Does this meet the goal?

  1. Yes

  2. No

Correct Answer: B

Explanation:

Azure Monitor for containers tracks the health and resource usage (such as CPU and memory) of the AKS cluster and its containers; it does not capture model accuracy metrics.

Question No.4

DRAG DROP

You have a container image that contains an AI solution. The solution will be used on demand and will be needed for only a few hours each month.

You plan to use Azure Functions to deploy the environment on-demand.

You need to recommend the deployment process. The solution must minimize costs.

Which four actions should you recommend Azure Functions perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.

[Image: list of actions and answer area not available]

Correct Answer:

[Image: correct answer not available]

Question No.5

A data scientist deploys a deep learning model on an Fsv2 virtual machine. Data analysis is slow.

You need to recommend which virtual machine series the data scientist must use to ensure that data analysis occurs as quickly as possible.

Which series should you recommend?

  1. ND

  2. B

  3. DC

  4. Ev3

Correct Answer: A

Explanation:

The N-series is a family of Azure Virtual Machines with GPU capabilities. GPUs are ideal for compute and graphics-intensive workloads, helping customers to fuel innovation through scenarios like high-end remote visualisation, deep learning and predictive analytics.

The ND-series is focused on training and inference scenarios for deep learning. It uses the NVIDIA Tesla P40 GPUs. The latest version – NDv2 – features the NVIDIA Tesla V100 GPUs.
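As a rough illustration of why a GPU-capable series speeds up deep learning compared with a CPU-only series such as Fsv2, the sketch below checks for a CUDA device and runs a computation on it. This is a minimal example assuming PyTorch is installed; the workload shown is a placeholder, not part of the question.

```python
# Minimal sketch: confirm that a CUDA-capable GPU (e.g., on an ND-series VM)
# is visible and place a computation on it. Assumes PyTorch is installed.
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print("Using device:", device)
if device.type == "cuda":
    print("GPU:", torch.cuda.get_device_name(0))

# Placeholder workload: a large matrix multiplication that benefits
# dramatically from GPU acceleration.
x = torch.randn(4096, 4096, device=device)
y = torch.randn(4096, 4096, device=device)
z = x @ y
print("Result shape:", z.shape)
```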

References:

https://azure.microsoft.com/en-in/pricing/details/virtual-machines/series/

Question No.6

You are designing an AI application that will perform real-time processing by using Microsoft Azure Stream Analytics.

You need to identify the valid outputs of a Stream Analytics job.

What are three possible outputs? Each correct answer presents a complete solution.

NOTE: Each correct selection is worth one point.

  1. a Hive table in Azure HDInsight

  2. Azure SQL Database

  3. Azure Cosmos DB

  4. Azure Blob storage

  5. Azure Redis Cache

Correct Answer: BCD

Explanation:

Valid Stream Analytics outputs include Azure SQL Database, Azure Cosmos DB, and Azure Blob storage. A Hive table in Azure HDInsight and Azure Redis Cache are not supported output sinks.

References:

https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-define-outputs

Question No.7

You create an Azure Machine Learning Studio experiment.

You plan to publish the experiment as a Machine Learning Web service.

You need to ensure that you can consume the web service from Microsoft Excel spreadsheets.

What should you use?

  1. a Batch Execution Service (BES) and an Azure managed identity

  2. a Request-Response Service (RRS) and an Azure managed identity

  3. a Request-Response Service (RRS) and an API key

  4. a Batch Execution Service (BES) and an API key

Correct Answer: C

Explanation:

Steps to Add a New web service:

  1. Deploy a web service or use an existing Web service.

  2. Click Consume.

  3. Look for the Basic consumption info section. Copy and save the Primary Key and the Request-Response URL.

  4. In Excel, go to the Web Services section (if you are in the Predict section, click the back arrow to go to the list of web services).

  5. Click Add Web Service.

  6. Paste the URL into the Excel add-in text box labeled URL.

  7. Paste the API/Primary key into the text box labeled API key.

  8. Click Add.
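The same Request-Response Service can also be called directly over HTTPS with the primary key, which is essentially what the Excel add-in does behind the scenes. Below is a minimal Python sketch; the endpoint URL, key, and input column names are placeholders, not values taken from this question.

```python
# Minimal sketch: call an Azure ML Studio (classic) Request-Response Service.
# The URL, API key, and input schema below are placeholders.
import json
import urllib.request

url = "https://ussouthcentral.services.azureml.net/workspaces/<workspace>/services/<service>/execute?api-version=2.0"
api_key = "<primary-key>"  # the Primary Key copied in step 3

payload = {
    "Inputs": {
        "input1": {
            "ColumnNames": ["feature1", "feature2"],
            "Values": [["1.0", "2.0"]],
        }
    },
    "GlobalParameters": {},
}

req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer " + api_key,
    },
)

with urllib.request.urlopen(req) as response:
    print(json.loads(response.read()))
```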

References:

https://docs.microsoft.com/en-us/azure/machine-learning/studio/excel-add-in-for-web-services

Question No.8

Your company plans to deploy an AI solution that processes IoT data in real-time.

You need to recommend a solution for the planned deployment that meets the following requirements:

Sustain up to 50 Mbps of events without throttling.

Retain data for 60 days.

What should you recommend?

  1. Apache Kafka

  2. Microsoft Azure IoT Hub

  3. Microsoft Azure Data Factory

  4. Microsoft Azure Machine Learning

Correct Answer: A

Explanation:

Apache Kafka is an open-source distributed streaming platform that can be used to build real-time streaming data pipelines and applications. Azure IoT Hub retains device-to-cloud messages for at most seven days, so it cannot satisfy the 60-day retention requirement.
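As a rough illustration of the ingestion side of such a pipeline, the sketch below publishes JSON events to a Kafka topic. It assumes the kafka-python package and a reachable broker; the broker address, topic name, and event fields are placeholders.

```python
# Minimal sketch: publish IoT events to a Kafka topic for real-time processing.
# Assumes the kafka-python package (pip install kafka-python); the broker
# address and topic name are placeholders.
import json
import time

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="kafka-broker:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

event = {"deviceId": "sensor-01", "temperature": 21.7, "timestamp": time.time()}
producer.send("iot-events", value=event)  # topic name is a placeholder
producer.flush()
```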

References:

https://docs.microsoft.com/en-us/azure/hdinsight/kafka/apache-kafka-introduction

Question No.9

HOTSPOT

You are designing a solution that will analyze bank transactions in real time. The transactions will be evaluated by using an algorithm and classified into one of five groups. The transaction data will be enriched with information taken from Azure SQL Database before the transactions are sent to the classification process. The enrichment process will require custom code. Data from different banks will require different stored procedures.

You need to develop a pipeline for the solution.

Which components should you use for data ingestion and data preparation? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

[Image: answer area not available]

Correct Answer:

[Image: correct answer not available]

Question No.10

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You are deploying an Azure Machine Learning model to an Azure Kubernetes Service (AKS) container.

You need to monitor the accuracy of each run of the model.

Solution: You configure Azure Application Insights.

Does this meet the goal?

  1. Yes

  2. No

Correct Answer: A

Explanation:

Application Insights can be enabled for the deployed web service to collect telemetry from each scoring run, which can include custom metrics such as model accuracy logged by the service.

Get Full Version of the Exam
AI-100 Dumps
AI-100 VCE and PDF
