Exam 70-473 Designing and Implementing Cloud Data Platform Solutions

Published: October 27, 2015
Languages: English
Audiences: IT Professionals
Technology: Microsoft Azure
Credit toward certification: Specialist

Skills measured
This exam measures your ability to accomplish the technical tasks listed below. View video tutorials about the variety of question types on Microsoft exams.

Please note that the questions may test on, but will not be limited to, the topics described in the bulleted text.

Do you have feedback about the relevance of the skills measured on this exam? Please send Microsoft your comments. All feedback will be reviewed and incorporated as appropriate while still maintaining the validity and reliability of the certification process. Note that Microsoft will not respond directly to your feedback. We appreciate your input in ensuring the quality of the Microsoft Certification program.

If you have concerns about specific questions on this exam, please submit an exam challenge.

If you have other questions or feedback about Microsoft Certification exams or about the certification program, registration, or promotions, please contact your Regional Service Center.

Design and implement database solutions for Microsoft SQL Server and SQL Database (20–25%)
Design a hybrid SQL Server solution
Design Geo/DR topology, design a data storage architecture, design a security architecture, design a data load strategy
Implement SQL Server on Azure Virtual Machines (VMs)
Provision SQL Server in an Azure VM, configure firewall rules, configure and optimize storage, migrate an on-premises database to Microsoft Azure, configure and optimize VM sizes by workload
Design a SQL Database solution
Design a solution architecture, design Geo/DR topology, design a security architecture, design a data load strategy, determine the appropriate service tier
Implement SQL Database
Provision SQL Database, configure firewall rules, configure active geo-replication, migrate an on-premises database to SQL Database, configure for scale and performance
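The firewall and geo-replication tasks above can be sketched in T-SQL. This is a minimal illustration, not a full provisioning script; the server, database, and rule names and the IP range are hypothetical, and the statements must be run against the logical master database of an Azure SQL server.

```sql
-- Server-level firewall rule (sp_set_firewall_rule is an Azure SQL
-- system procedure in master; the IP range here is an example):
EXECUTE sp_set_firewall_rule
    @name = N'OfficeRange',
    @start_ip_address = '203.0.113.1',
    @end_ip_address = '203.0.113.254';

-- Active geo-replication: create a readable secondary on another server.
-- Run in master on the primary server; secondaryserver must already exist.
ALTER DATABASE MyDb
    ADD SECONDARY ON SERVER secondaryserver
    WITH (ALLOW_CONNECTIONS = ALL);
```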
Design and implement data warehousing on Azure
Design a data warehousing solution on Azure, design a data load strategy and topology, configure SQL Data Warehouse, migrate an on-premises database to SQL Data Warehouse

Manage database management systems (DBMS) security (25–30%)
Design and implement SQL Server Database security
Configure firewalls; manage logins, users, and roles; assign permissions; configure auditing; configure transparent database encryption
Implement Azure SQL Database security
Configure firewalls; manage logins, users, and roles; assign permissions; configure auditing; configure row-level security; configure data encryption; configure data masking; configure Always Encrypted
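The row-level security and data masking objectives above map to a small amount of T-SQL. The sketch below assumes a `Security` schema already exists and uses hypothetical table and column names; it shows the general pattern (predicate function plus security policy, then a masked column), not a production design.

```sql
-- Row-level security: an inline predicate function plus a security policy
-- that filters dbo.Sales rows to the current user's own rows.
CREATE FUNCTION Security.fn_SalesPredicate (@SalesRep AS sysname)
    RETURNS TABLE
    WITH SCHEMABINDING
AS
    RETURN SELECT 1 AS fn_result
           WHERE @SalesRep = USER_NAME();
GO

CREATE SECURITY POLICY Security.SalesFilter
    ADD FILTER PREDICATE Security.fn_SalesPredicate(SalesRep)
    ON dbo.Sales
    WITH (STATE = ON);
GO

-- Dynamic data masking on an existing column:
ALTER TABLE dbo.Customers
    ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');
```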

Design for high availability, disaster recovery, and scalability (25–30%)
Design and implement high availability solutions
Design a high availability solution topology, implement high availability solutions between on-premises and Azure, design cloud-based backup solutions, implement backup and recovery strategies
Design and implement scalable solutions
Design a scale-out solution, implement multi-master scenarios with database replication, implement elastic scale for SQL Database
Design and implement SQL Database data recovery
Design a backup solution for SQL Database, implement self-service restore, copy and export databases
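The database-copy objective above is a single T-SQL statement in Azure SQL Database. Server and database names are illustrative; the statement is run in master on the server that will host the copy.

```sql
-- Creates a transactionally consistent copy of an existing database,
-- optionally from another server:
CREATE DATABASE SalesDb_Copy AS COPY OF myserver.SalesDb;
```

Point-in-time and geo-restore, by contrast, are driven from the Azure portal or PowerShell rather than T-SQL.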

Monitor and manage database implementations on Azure (25–30%)
Monitor and troubleshoot SQL Server VMs on Azure
Monitor database and instance activity, monitor using dynamic management views (DMVs) and dynamic management functions (DMFs), monitor performance and scalability
Monitor and troubleshoot SQL Database
Monitor and troubleshoot SQL Database, monitor database activity, monitor using DMVs and DMFs, monitor performance and scalability
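The DMV monitoring tasks above can be illustrated with two common queries. `sys.dm_db_resource_stats` is specific to Azure SQL Database; `sys.dm_exec_requests` works in both SQL Database and SQL Server.

```sql
-- Resource consumption over roughly the last hour
-- (one row per ~15-second interval):
SELECT end_time,
       avg_cpu_percent,
       avg_data_io_percent,
       avg_log_write_percent
FROM sys.dm_db_resource_stats
ORDER BY end_time DESC;

-- Currently executing requests and their waits:
SELECT session_id, status, command, wait_type, total_elapsed_time
FROM sys.dm_exec_requests;
```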
Automate and manage database implementations on Azure
Manage SQL Server in Azure VMs with PowerShell, manage Azure SQL Database with PowerShell, configure Automation and Runbooks
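A PowerShell sketch of the management tasks above, using the AzureRM cmdlets that were current when this exam was published (the Az module has since replaced them). Resource group, server, and rule names are hypothetical.

```powershell
# Authenticate, then manage the logical server from PowerShell.
Login-AzureRmAccount

# Add a server-level firewall rule:
New-AzureRmSqlServerFirewallRule `
    -ResourceGroupName 'rg-data' -ServerName 'myserver' `
    -FirewallRuleName 'OfficeRange' `
    -StartIpAddress '203.0.113.1' -EndIpAddress '203.0.113.254'

# List databases on the server:
Get-AzureRmSqlDatabase -ResourceGroupName 'rg-data' -ServerName 'myserver'
```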

Click here to view complete Q&A of 70-473 exam
Certkingdom Review

MCTS Training, MCITP Training

Best Microsoft MCTS Certification, Microsoft 70-473 Training at certkingdom.com

 

Exam 70-470 Recertification for MCSE: Business Intelligence

Published: August 10, 2014
Languages: English, Japanese
Audiences: IT professionals
Technology: Microsoft SQL Server 2014
Credit toward certification: MCP, MCSE

Skills measured
This exam measures your ability to accomplish the technical tasks listed below. View video tutorials about the variety of question types on Microsoft exams.

Please note that the questions may test on, but will not be limited to, the topics described in the bulleted text.

Do you have feedback about the relevance of the skills measured on this exam? Please send Microsoft your comments. All feedback will be reviewed and incorporated as appropriate while still maintaining the validity and reliability of the certification process. Note that Microsoft will not respond directly to your feedback. We appreciate your input in ensuring the quality of the Microsoft Certification program.

If you have concerns about specific questions on this exam, please submit an exam challenge.

If you have other questions or feedback about Microsoft Certification exams or about the certification program, registration, or promotions, please contact your Regional Service Center.

Build an analysis services multidimensional database
Implement a cube
Use SQL Server Data Tools – Business Intelligence (SSDT-BI) to build the cube; use SSDT-BI to define non-additive or semi-additive measures in a cube, define measures, specify perspectives, define translations, define dimension usage, define cube-specific dimension properties, define measure groups, implement reference dimensions, implement many-to-many relationships, implement fact relationships, implement role-playing relationships, create and manage linked measure groups and linked dimensions, create actions
Implement custom logic in a data model
Define key performance indicators (KPIs); define calculated members; create relative measures (growth, YoY, same period last year), percentage of total using MDX; named sets; add Time Intelligence; implement ranking and percentile; define MDX script to import partial PowerPivot model
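A same-period-last-year calculation of the kind listed above can be sketched in MDX. The cube, hierarchy, and measure names below assume the Adventure Works sample database; adapt them to your own model.

```mdx
WITH MEMBER [Measures].[Sales Same Period Last Year] AS
    ( ParallelPeriod(
          [Date].[Calendar].[Calendar Year], 1,
          [Date].[Calendar].CurrentMember ),
      [Measures].[Sales Amount] )
MEMBER [Measures].[Sales YoY Growth] AS
    [Measures].[Sales Amount] - [Measures].[Sales Same Period Last Year]
SELECT { [Measures].[Sales Amount],
         [Measures].[Sales YoY Growth] } ON COLUMNS,
       [Date].[Calendar].[Month].Members ON ROWS
FROM [Adventure Works]
```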
Select an appropriate model for data analysis
Select Tabular versus Multidimensional based on scalability needs, traditional hierarchical, data volume; select appropriate organizational BI, such as corporate BI or PowerBI, and team and personal BI needs and data status

Manage, maintain, and troubleshoot a SQL Server Analysis Services (SSAS) database
Process data models
Define processing of tables or partitions for tabular and multidimensional models; define processing of databases, cubes, and dimensions for multidimensional models; select full processing versus incremental processing; define remote processing; define lazy aggregations; automate with Analysis Management Objects (AMO) or XML for Analysis (XMLA); process and manage partitions by using PowerShell
Install and maintain an SSAS instance
Install SSAS; install development tools; identify development and production installation considerations; upgrade SSAS instance; define data file and program file location; plan for Administrator accounts; define server and database level security; support scale-out read-only; update SSAS (service packs); install and maintain each instance type of Analysis Services, including PowerPivot; restore and import PowerPivot; back up and restore by using PowerShell

Build a tabular data model
Implement a tabular data model
Define tables, import data, define calculated columns, define relationships, define hierarchies and perspectives, manage visibility of columns and tables, embed links, optimize BISM for Power View, mark a date table, sort a column by another column
Implement data access for a tabular data model
Manage partitions, processing, select xVelocity versus DirectQuery for data access

Build a report with SQL Server Reporting Services (SSRS)
Design a report
Select report components (crosstab report, Tablix, design chart, data visualization components), design report templates (Report Definition Language), identify the data source and parameters; design a grouping structure; drill-down reports, drill-through reports; determine if any expressions are required to display data that is not coming directly from the data source
Manage a report environment
Manage subscriptions and subscription settings; define data-driven subscriptions; manage data sources; integrate SharePoint Server; define email delivery settings; manage the number of snapshots; manage schedules, running jobs, and report server logs; manage report server databases; manage the encryption keys; set up the execution log reporting; review the reports; configure site-level settings; design report lifecycle; automate management of reporting services; create a report organization structure; install and configure reporting services; deploy custom assemblies
Configure report data sources and datasets
Select appropriate query types (stored procedure versus table versus text only); configure parameterized connection strings (dynamic connection strings); define filter location (dataset versus query); configure data source options, for example, extract and connect to multiple data sources; shared and embedded data sources and datasets; use custom expressions in data sources; connect to Microsoft Azure SQL database; connect to Microsoft Azure Marketplace; implement DAX and MDX queries to retrieve appropriate data sets; work with non-relational data sources, such as XML or SharePoint lists; connect to HDInsight Server

Plan business intelligence (BI) infrastructure
Plan for performance
Optimize batch procedures: extract, transform, load (ETL) in SQL Server Integration Services (SSIS)/SQL and processing phase in Analysis Services; configure Proactive Caching within SQL Server Analysis Services (SSAS) for different scenarios; understand performance consequences of named queries in a data source view; analyze and optimize performance, including Multidimensional Expression (MDX) and Data Analysis Expression (DAX) queries; understand the difference between partitioning for load performance versus query performance in SSAS; appropriately index a fact table; optimize Analysis Services cubes in SQL Server Data Tools; create aggregations

Design BI infrastructure
Design a high availability and disaster recovery strategy
Design a recovery strategy, back up and restore SSAS databases, back up and restore SSRS databases, move and restore the SSIS Catalog, design an AlwaysON solution

Design a reporting solution
Design a Reporting Services dataset
Design appropriate data query parameters, create appropriate SQL queries, create appropriate DAX queries for an application, manage data rights and security, extract data from analysis services by using MDX queries, balance query-based processing versus filter-based processing, manage data sets through the use of stored procedures
Manage Excel Services/reporting for SharePoint
Configure data refresh schedules for PowerPivot published to SharePoint, publish BI info to SharePoint, use SharePoint to accomplish BI administrative tasks, install and configure Power View, publish PowerPivot and Power View to SharePoint
Design BI reporting solution architecture
Linked drill-down reports, drill-through reports, and sub reports; design report migration strategies; access report services API; design code-behind strategies; identify when to use Reporting Services (RS), Report Builder (RB), or Power View; design and implement context transfer when interlinking all types of reports (RS, RB, Power View, Excel); implement BI tools for reporting in SharePoint (Excel Services versus PowerView versus Reporting Services); select a subscription strategy; enable Data Alerts; design map visualization

Design BI data models
Design the data warehouse
Design a data model that is optimized for reporting; design and build a cube on top; design enterprise data warehouse (EDW) and OLAP cubes; choose between natural keys and surrogate keys when designing the data warehouse; use SQL Server to design, implement, and maintain a data warehouse, including partitioning, slowly changing dimensions (SCD), change data capture (CDC), Index Views, and column store indexes; identify design best practices; implement a many-to-many relationship in an OLAP cube; design a data mart/warehouse in reverse from an Analysis Services cube; implement incremental data load; choose between performing aggregation operations in the SSIS pipeline or the relational engine
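One of the warehouse techniques listed above, columnstore indexing of a fact table, is a single statement in the SQL Server 2012/2014 era this exam targets. Table and column names are hypothetical.

```sql
-- Nonclustered columnstore index over the commonly scanned columns
-- of a fact table (read-only in 2012; updatable clustered columnstore
-- arrived in 2014):
CREATE NONCLUSTERED COLUMNSTORE INDEX ncci_FactSales
ON dbo.FactSales (OrderDateKey, ProductKey, SalesAmount, OrderQuantity);
```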
Design cube architecture
Partition cubes and build aggregation strategies for the separate partitions; design a data model; choose the proper partitioning strategy for the data warehouse and cube; design the data file layout; identify the aggregation method for a measure in a MOLAP cube; performance tune a MOLAP cube using aggregations; design a data source view; design for cube drill-through and write back actions; choose the correct grain of data to store in a measure group; design analysis services processing by using indexes, indexed views, and order by statements

Design an ETL solution
Design SSIS package execution
Use the new project deployment model; pass values at execution time; share parameters between packages; plan for incremental loads versus full loads; optimize execution by using Balanced Data Distributor (BDD); choose optimal processing strategy (including Script transform, flat file incremental loads, and Derived Column transform)
Plan to deploy SSIS solutions
Deploy the package to another server with different security requirements, secure integration services packages that are deployed at the file system, demonstrate awareness of SSIS packages/projects and how they interact with environments (including recoverability), decide between performing aggregation operations in the SSIS pipeline or the relational engine, plan to automate SSIS deployment, plan the administration of the SSIS Catalog database

QUESTION 1
You need to identify the reports that produce the errors that Marc is receiving.
What should you do?

A. Write a query by using the Subscriptions table in the report server database.
B. Use the Windows Event Viewer to search the Application log for errors.
C. Write a query by using the ExecutionLog3 view in the report server database.
D. Search the ReportServerService_<timestamp>.log file for errors.

Answer: C
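A query of the kind the correct answer describes might look like the following. It runs against the report server database; `rsSuccess` is the status value ExecutionLog3 records for successful executions.

```sql
-- Recent failed report executions from the ExecutionLog3 view:
SELECT ItemPath, UserName, TimeStart, Status
FROM dbo.ExecutionLog3
WHERE Status <> 'rsSuccess'
ORDER BY TimeStart DESC;
```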


QUESTION 2
You need to deploy the StandardReports project.
What should you do? (Each correct answer presents a complete solution. Choose all that apply.)

A. Deploy the project from SQL Server Data Tools (SSDT).
B. Use the Analysis Services Deployment utility to create an XMLA deployment script.
C. Use the Analysis Services Deployment wizard to create an MDX deployment script.
D. Use the Analysis Services Deployment wizard to create an XMLA deployment script.

Answer: A,D

Explanation: There are several methods you can use to deploy a tabular model project. Most of the deployment methods that can be used for other Analysis Services projects, such as multidimensional, can also be used to deploy tabular model projects.
A: Deploy command in SQL Server Data Tools
The Deploy command provides a simple and intuitive method to deploy a tabular model project from the SQL Server Data Tools authoring environment.
Caution:
This method should not be used to deploy to production servers. Using this method can overwrite certain properties in an existing model.
D: The Analysis Services Deployment Wizard uses the XML output files generated from a Microsoft SQL Server Analysis Services project as input files. These input files are easily modifiable to customize the deployment of an Analysis Services project. The generated deployment script can then either be immediately run or saved for later deployment.
Incorrect:
not B: The Microsoft.AnalysisServices.Deployment utility lets you start the Microsoft SQL Server Analysis Services deployment engine from the command prompt. As input file, the utility uses the XML output files generated by building an Analysis Services project in SQL Server Data Tools (SSDT).


QUESTION 3
You need to create the data source view for the StandardReports project.
What should you do?

A. Generate a relational schema from the dimensions and cubes by using the Schema Generation wizard.
B. Create a data source, connect it to the data warehouse, and then use the Data Source View wizard.
C. Execute the Import from Table wizard and then use the Data Source View wizard.
D. Create a new data source view and then use the Import from Table wizard.

Answer: B


QUESTION 4
You need to ascertain why Marc did not receive his reports.
What should you do?

A. Search the ReportServerService_<timestamp>.log file for errors.
B. Search the registry for errors.
C. Use the Windows Event Viewer to search the Application log for errors.
D. Use SQL Server Management Studio to search the SQL Server logs for errors.

Answer: A


QUESTION 5
You need to create a measure for DOD sales.
What should you do? (Each correct answer presents part of the solution. Choose all that apply.)

A. Specify a date table by using a Mark as Date table.
B. Use the Data Analysis Expressions (DAX) PARALLELPERIOD() function.
C. Use the Business Intelligence Wizard to define time intelligence.
D. Use the Multidimensional Expressions (MDX) LAG() function.

Answer: A,C

Explanation: * From scenario:
A measure must be created to calculate day-over-day (DOD) sales by region based on order date.
A: Specify Mark as Date Table for use with Time Intelligence (SSAS Tabular)
In order to use time intelligence functions in DAX formulas, you must specify a date table and a unique identifier (datetime) column of the Date data type. Once a column in the date table is specified as a unique identifier, you can create relationships between columns in the date table and any fact tables.
C: The time intelligence enhancement is a cube enhancement that adds time calculations (or time views) to a selected hierarchy. This enhancement supports the following categories of calculations:
Period to date.
Period over period growth. Moving averages.
Parallel period comparisons.
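For reference, a day-over-day measure of the kind the scenario describes can be written in DAX, assuming a marked date table named 'Date' and a Sales[SalesAmount] column; all names here are illustrative. Note that DATEADD, not PARALLELPERIOD, is used, since PARALLELPERIOD only accepts month, quarter, or year intervals.

```dax
Sales DOD :=
    SUM ( Sales[SalesAmount] )
        - CALCULATE (
              SUM ( Sales[SalesAmount] ),
              DATEADD ( 'Date'[Date], -1, DAY )
          )
```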

Click here to view complete Q&A of 70-470 exam

Best Microsoft MCTS Certification, Microsoft 70-470 Training at certkingdom.com

DevOps and deviance: How bad IT practices become accepted as normal

What IT can learn from the study of the “normalization of deviance” phenomenon

Peter Waterhouse, Senior Strategist, CA Technologies

Although vendor-written, this contributed piece does not promote a product or service and has been edited and approved by Network World editors.

How many times have you witnessed a sub-optimal IT practice that everyone else thinks is ok, then over time accepted the behavior as being just fine and dandy?

Regardless of whether you lead a startup or work in an established business, we all have a tendency to accept dodgy behaviors. Even if outsiders see them as wrong, our IT teams are so accustomed to using them (without any adverse consequences) that they’re quickly established as “normal” and accepted.

Studies into what’s commonly referred to as the “normalization of deviance” have been conducted in areas ranging from healthcare to aerospace, with evidence showing that many serious errors and disasters occur because established standards have been bypassed and bad practices “normalized”.

While examining this phenomenon is critical in the context of safety, it’s equally applicable in how we develop, secure and operate software applications. With the boundaries blurred between the digital and physical world, any adverse behavior leading to security and reliability issues could have dire consequences for customers. And when software becomes infused into long-lasting products (from light bulbs to limousines) it’s not so easy to exit markets.

As businesses look to software innovation for growth, time-to-market and quality become essential differentiators. Unfortunately both can be compromised if pre-existing change aversion or newer “speed at all cost” mandates lead to a normalization of deviance. More critically, if a head-in-the-sand IT culture persists, systemic business failures may eventuate – think massive security breaches or major application outages.

The DevOps movement, with its focus on collaboration across development and other IT functions, is now regarded as the best way of establishing the culture and environment needed to support fast and reliable software delivery. So maybe the secret to helping IT identify and eliminate poor practices is to take the benefits of DevOps and then apply guidance from other fields that are fighting the normalization of deviance.

In healthcare, for example, studies illustrate seven factors that lead to a normalization of deviance, all of which are IT relevant:

The rules are stupid and inefficient – in healthcare, accidents occur when practitioners disable equipment warning systems because alarms are seen as distracting. This happens in IT all the time, as when operations staff filter out alerts they regard as irrelevant noise. It also surfaces when testing is skipped because of manual processing and set-up delays.

Knowledge is imperfect and uneven – employees might not know a rule exists, or they might be taught a practice without realizing that it’s sub-optimal. In IT this persists because many new employees feel uncomfortable asking for help, or because the application of new technologies distorts logical thinking.

The work itself, along with new technology, can disrupt work behaviors – to support goals of more continuous software delivery, organizations are introducing many new technologies and methods, like microservices and containers. New work practices and learning demands may lead staff to poorly implement technology or use it to perform functions it was never designed for.

We’re breaking rules for the good of the business – staff may bypass rules and good practice when they’re incentivized on faster delivery times or delivering new functional software enhancements. For example, repeatedly procuring additional (but unnecessary) hardware to rush through an update, rather than addressing the root-cause of performance problems.

The rules don’t apply to us…trust us – autonomous agile teams are beneficial, but empowering them to select their own one-off tools or to bypass compliance policies can compromise program objectives or lead to security breaches. Unfortunately in today’s fast-paced digital business, talented professionals often feel completely justified in playing the trust card.

Employees are afraid to speak up – violations become normal when employees stay silent. How many times have poor software code, costly projects, and bad managers been tolerated because people are afraid to speak up? Even in IT organizations that have a strong blameless culture, people will stay quiet for fear of appearing “mean”.

Leaders withhold or dilute findings on application problems – whether you work in healthcare or IT, no-one wants to look bad to managers. Rather than present ugly news, many will distort the truth; presenting diluted or misleading information up the command chain. In IT this behavior is easily normalized, especially if teams get away with reporting technical vanity metrics over business outcomes.

No sudden cultural reawakening in IT or liberal sprinkling of collaboration fairy dust will eliminate ingrained bad practices, but DevOps and Lean thinking can help identify warning signals. This starts with leaders visualizing the flow of value delivered by software applications, pinpointing all the bottlenecks and constraints impeding delivery.

Like stepping stones on a pathway, these are the value interrupts which, when lifted, reveal the process and technology issues that cause good people to do the wrong things. Immediate candidates are software release and testing processes, but don’t restrict analysis to the development side of the software factory. Every stone, be that enterprise architecture, stakeholder engagement, vendor management, operations or customer support, can hide ugly behaviors that over time become normalized.

Of course, identification is just the start. Next comes the hard part, with leaders using evidence to impress how behaviors impact current performance and business outcomes. This might involve using new tools, but that again courts disaster when advanced technology becomes a vehicle to automate bad processes.

As with anything involving people, the organizational and psychological barriers that encourage staff to break rules, or their colleagues to remain silent, are where most attention should be focused.

 

Click here to view complete Q&A of 70-467 exam

Best Microsoft MCTS Certification, Microsoft 70-467 Training at certkingdom.com