
DevOps and deviance: How bad IT practices become accepted as normal

What IT can learn from the study of the “normalization of deviance” phenomenon

Peter Waterhouse, Senior Strategist, CA Technologies

Although vendor-written, this contributed piece does not promote a product or service and has been edited and approved by Network World editors.

How many times have you witnessed a sub-optimal IT practice that everyone else thinks is OK, then over time come to accept the behavior yourself as just fine and dandy?

Regardless of whether you lead a startup or work in an established business, we all have a tendency to accept dodgy behaviors. Even if outsiders see them as wrong, our IT teams are so accustomed to using them (without any adverse consequences) that they’re quickly established as “normal” and accepted.

Studies into what’s commonly referred to as the “normalization of deviance” have been conducted in fields ranging from healthcare to aerospace, with evidence showing that many serious errors and disasters occur because established standards have been bypassed and bad practices “normalized”.

While examining this phenomenon is critical in the context of safety, it’s equally applicable to how we develop, secure and operate software applications. With the boundaries blurred between the digital and physical world, any adverse behavior leading to security and reliability issues could have dire consequences for customers. And when software becomes infused into long-lasting products (from light bulbs to limousines) it’s not so easy to exit markets.

As businesses look to software innovation for growth, time-to-market and quality become essential differentiators. Unfortunately both can be compromised if pre-existing change aversion or newer “speed at all cost” mandates lead to a normalization of deviance. More critically, if a head-in-the-sand IT culture persists, systemic business failures may eventuate – think massive security breaches or major application outages.

The DevOps movement, with its focus on collaboration across development and other IT functions, is now regarded as the best way of establishing the culture and environment needed to support fast and reliable software delivery. So maybe the secret to helping IT identify and eliminate poor practices is to combine the benefits of DevOps with guidance from other fields that are fighting normalization of deviance.

In healthcare, for example, studies identify seven factors that lead to a normalization of deviance, all of which are relevant to IT:

The rules are stupid and inefficient – in healthcare, accidents occur when practitioners disable equipment warning systems because alarms are seen as distracting. This happens in IT all the time, like in operations where staff will filter out noise and alerts they regard as irrelevant. It also surfaces when testing is skipped because of manual processing and set-up delays.

Knowledge is imperfect and uneven – employees might not know a rule exists, or they might be taught a practice without realizing that it’s sub-optimal. In IT this persists because many new employees feel uncomfortable asking for help, or because the application of new technologies distorts logical thinking.

The work itself, along with new technology, can disrupt work behaviors – to support goals of more continuous software delivery, organizations are introducing many new technologies and methods, like microservices and containers. New work practices and learning demands may lead staff to poorly implement technology or use it to perform functions it was never designed for.

We’re breaking rules for the good of the business – staff may bypass rules and good practice when they’re incentivized on faster delivery times or delivering new functional software enhancements. For example, repeatedly procuring additional (but unnecessary) hardware to rush through an update, rather than addressing the root-cause of performance problems.

The rules don’t apply to us…trust us – autonomous agile teams are beneficial, but empowering them to select their own one-off tools or to bypass compliance policies can compromise program objectives or lead to security breaches. Unfortunately in today’s fast-paced digital business, talented professionals often feel completely justified in playing the trust card.

Employees are afraid to speak up – violations become normal when employees stay silent. How many times have poor software code, costly projects (and bad managers) been tolerated because people are afraid to speak up? Even in IT organizations that have a strong blameless culture, people will stay quiet for fear of appearing “mean”.

Leaders withhold or dilute findings on application problems – whether you work in healthcare or IT, no-one wants to look bad to managers. Rather than present ugly news, many will distort the truth; presenting diluted or misleading information up the command chain. In IT this behavior is easily normalized, especially if teams get away with reporting technical vanity metrics over business outcomes.

No sudden cultural reawakening in IT or liberal sprinkling of collaboration fairy dust will eliminate ingrained bad practices, but DevOps and Lean thinking can help identify warning signals. This starts with leaders visualizing the flow of value delivered by software applications, pinpointing all the bottlenecks and constraints impeding delivery.

Analogous to pathway stepping stones, these are the value interrupts which, when lifted, reveal the process and technology issues causing good people to do the wrong things. Immediate candidates are software release and testing processes, but don’t restrict analysis to the development side of the software factory. Every stone, be it enterprise architecture, stakeholder engagement, vendor management, operations or customer support, can hide ugly behaviors that over time become normalized.

Of course, identification is just the start. Next comes the hard part, with leaders using evidence to demonstrate how behaviors impact current performance and business outcomes. This might involve using new tools, but that again courts disaster when advanced technology becomes a vehicle to automate bad processes.

As with anything involving people, the organizational and psychological pressures that encourage staff to break rules, or their colleagues to remain silent, are where most attention should be focused.

 


Exam 70-467 Designing Business Intelligence Solutions with Microsoft SQL Server

Published: June 11, 2012
Languages: English, Chinese (Simplified), French, German, Japanese, Portuguese (Brazil)
Audiences: IT professionals
Technology: Microsoft SQL Server
Credit toward certification: MCP, MCSE

Skills measured
This exam measures your ability to accomplish the technical tasks listed below. The percentages indicate the relative weight of each major topic area on the exam. The higher the percentage, the more questions you are likely to see on that content area on the exam. View video tutorials about the variety of question types on Microsoft exams.

Please note that the questions may test on, but will not be limited to, the topics described in the bulleted text.

Do you have feedback about the relevance of the skills measured on this exam? Please send Microsoft your comments. All feedback will be reviewed and incorporated as appropriate while still maintaining the validity and reliability of the certification process. Note that Microsoft will not respond directly to your feedback. We appreciate your input in ensuring the quality of the Microsoft Certification program.

If you have concerns about specific questions on this exam, please submit an exam challenge.

If you have other questions or feedback about Microsoft Certification exams or about the certification program, registration, or promotions, please contact your Regional Service Center.

As of February 18, 2016, this exam includes content covering both SQL Server 2012 and 2014. Please note that this exam does not include questions on features or capabilities that are present only in the SQL Server 2012 product. For more information, please download and review this document.

Plan business intelligence (BI) infrastructure (15–20%)
Plan for performance
Optimize batch procedures: extract, transform, load (ETL) in SQL Server Integration Services (SSIS)/SQL and processing phase in Analysis Services; configure Proactive Caching within SQL Server Analysis Services (SSAS) for different scenarios; understand performance consequences of named queries in a data source view; analyze and optimize performance, including Multidimensional Expression (MDX) and Data Analysis Expression (DAX) queries; understand the difference between partitioning for load performance versus query performance in SSAS; appropriately index a fact table; optimize Analysis Services cubes in SQL Server Data Tools; create aggregations
Plan for scalability
Change binding options for partitions; choose the appropriate Multidimensional OLAP (MOLAP), Relational OLAP (ROLAP), and Hybrid OLAP (HOLAP) storage modes
Plan and manage upgrades
Plan change management for a BI solution
Maintain server health
Design an automation strategy

Preparation resources
Create and manage a local partition
Partition storage modes and processing
Designing Aggregations (Analysis Services – Multidimensional)
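
The “plan for performance” objective above includes appropriately indexing a fact table. A minimal T-SQL sketch, with hypothetical table and column names, and noting that in SQL Server 2012 a nonclustered columnstore index makes the table read-only until it is dropped or disabled:

-- Hypothetical fact table keyed on dimension surrogate keys.
CREATE TABLE dbo.FactSales (
    DateKey     int   NOT NULL,
    ProductKey  int   NOT NULL,
    SalesAmount money NOT NULL
);

-- Typical pattern: a clustered index on the most common range predicate (date)...
CREATE CLUSTERED INDEX CIX_FactSales_DateKey ON dbo.FactSales (DateKey);

-- ...plus a nonclustered columnstore index to speed up scan-heavy aggregations.
-- In SQL Server 2012 this makes dbo.FactSales read-only while it exists.
CREATE NONCLUSTERED COLUMNSTORE INDEX NCCI_FactSales
    ON dbo.FactSales (DateKey, ProductKey, SalesAmount);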

Design BI infrastructure (15–20%)
Design a security strategy
Configure security and impersonation between the SQL Server service, Analysis Services, and front end; implement Dynamic Dimension Security within a cube; configure security for an extranet environment; configure Kerberos security; design authentication mechanisms; design security tests; build secure solutions end to end; design roles for calculated measures; understand the tradeoffs between regular SSAS security and dynamic security
Design a SQL partitioning strategy
Choose the proper partitioning strategy for the data warehouse and cube, implement a parallel load to fact tables by using partition switching, use data compression
Design a high availability and disaster recovery strategy
Design a recovery strategy, back up and restore SSAS databases, back up and restore SSRS databases, move and restore the SSIS Catalog, design an AlwaysON solution
Design a logging and auditing strategy
Design a new SSIS logging infrastructure (for example, information available through the catalog views), validate data is balancing and reconciling correctly

Preparation resources
Granting cube access
Connection string properties
Partitioned tables and indexes
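
The “design a SQL partitioning strategy” objective above mentions implementing a parallel load to fact tables by using partition switching. A minimal sketch of the switch step, assuming the staging table already matches the target partition’s structure, indexes, and filegroup (all object names are hypothetical):

-- Load the staging table in parallel with other partitions' staging tables,
-- then switch it into the target partition as a metadata-only operation.
ALTER TABLE dbo.Stage_FactSales_201601
    SWITCH TO dbo.FactSales PARTITION 13;

-- The switch succeeds only if Stage_FactSales_201601 has the same columns,
-- indexes, and filegroup as partition 13, plus a trusted CHECK constraint
-- proving every row falls within that partition's boundary values.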

Design a reporting solution (20–25%)
Design a Reporting Services dataset
Design appropriate data query parameters, create appropriate SQL queries, create appropriate DAX queries for an application, manage data rights and security, extract data from analysis services by using MDX queries, balance query-based processing versus filter-based processing, manage data sets through the use of stored procedures
Manage Excel Services/reporting for SharePoint
Configure data refresh schedules for PowerPivot published to SharePoint, publish BI info to SharePoint, use SharePoint to accomplish BI administrative tasks, install and configure Power View, publish PowerPivot and Power View to SharePoint
Design a data acquisition strategy
Identify the data sources that need to be used to pull in the data, determine the changes (incremental data) in the data source (time window), identify the relationship and dependencies between the data sources, determine who can access which data, determine what data can be retained for how long (regulatory compliance, data archiving, aging), design a data movement strategy, profile source data
Plan and manage reporting services configuration
Choose the appropriate reporting services requirements (including native mode and SharePoint mode)
Design BI reporting solution architecture
Linked drill-down reports, drill-through reports, and sub reports; design report migration strategies; access report services API; design code-behind strategies; identify when to use Reporting Services (RS), Report Builder (RB), or Power View; design and implement context transfer when interlinking all types of reports (RS, RB, Power View, Excel); implement BI tools for reporting in SharePoint (Excel Services versus Power View versus Reporting Services); select a subscription strategy; enable Data Alerts; design map visualization

Preparation resources
Configure Windows Authentication on the Report Server
Different ways to update data in PowerPivot
Add dataset filters, data region filters, and group filters (Report Builder and SSRS)
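
Among the dataset objectives above is managing data sets through the use of stored procedures. A minimal sketch of a parameterized procedure an SSRS dataset could call, keeping query logic out of the report definition (all names are hypothetical):

-- An SSRS dataset can reference this procedure and map report parameters
-- to @StartDate and @EndDate.
CREATE PROCEDURE dbo.rpt_SalesByRegion
    @StartDate date,
    @EndDate   date
AS
BEGIN
    SET NOCOUNT ON;
    SELECT r.RegionName, SUM(f.SalesAmount) AS TotalSales
    FROM dbo.FactSales AS f
    JOIN dbo.DimRegion AS r ON r.RegionKey = f.RegionKey
    WHERE f.OrderDate >= @StartDate AND f.OrderDate < @EndDate
    GROUP BY r.RegionName;
END;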

Design BI data models (30–35%)

Design the data warehouse
Design a data model that is optimized for reporting; design and build a cube on top; design enterprise data warehouse (EDW) and OLAP cubes; choose between natural keys and surrogate keys when designing the data warehouse; use SQL Server to design, implement, and maintain a data warehouse, including partitioning, slowly changing dimensions (SCD), change data capture (CDC), Index Views, and column store indexes; identify design best practices; implement a many-to-many relationship in an OLAP cube; design a data mart/warehouse in reverse from an Analysis Services cube; implement incremental data load; choose between performing aggregation operations in the SSIS pipeline or the relational engine
Design a schema
Multidimensional modeling starting from a star or snowflake schema, design relational modeling for a Data Mart
Design cube architecture
Partition cubes and build aggregation strategies for the separate partitions; design a data model; choose the proper partitioning strategy for the data warehouse and cube; design the data file layout; identify the aggregation method for a measure in a MOLAP cube; performance tune a MOLAP cube using aggregations; design a data source view; design for cube drill-through and write back actions; choose the correct grain of data to store in a measure group; design analysis services processing by using indexes, indexed views, and order by statements
Design fact tables
Design a data warehouse that supports many to many dimensions with factless fact tables
Design BI semantic models
Plan for a multidimensional cube; support a many-to-many relationship between tables; choose between multidimensional and tabular, depending on the type of data and workload
Design and create MDX calculations
Design MDX queries, identify the structures of MDX and the common functions (tuples, sets, TopCount, SCOPE, VisualTotals, and more), create calculated members in an MDX statement, identify which MDX statement would return the required result, implement a custom MDX or logical solution for a pre-prepared case task

Preparation resources
Defining a many-to-many relationship
Define Relationship dialog box (Analysis Services – multidimensional data)
Set aggregation options (Usage-Based Optimization Wizard)
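
The data warehouse design objective above lists change data capture (CDC) as one technique for maintaining a warehouse. A minimal sketch of enabling it on a hypothetical source table (requires SQL Server Agent running, and sysadmin rights to enable at the database level):

-- Enable CDC for the current database (run once, as sysadmin).
EXEC sys.sp_cdc_enable_db;

-- Track changes to a hypothetical source table; a NULL role means access
-- to the change data is not gated by a database role.
EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'SalesOrders',
    @role_name     = NULL;

-- Downstream ETL can then read changes between two log sequence numbers
-- via the generated cdc.fn_cdc_get_all_changes_dbo_SalesOrders function.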

Design an ETL solution (10–15%)
Design SSIS package execution
Use the new project deployment model; pass values at execution time; share parameters between packages; plan for incremental loads versus full loads; optimize execution by using Balanced Data Distributor (BDD); choose optimal processing strategy (including Script transform, flat file incremental loads, and Derived Column transform)
Plan to deploy SSIS solutions
Deploy the package to another server with different security requirements, secure integration services packages that are deployed at the file system, demonstrate awareness of SSIS packages/projects and how they interact with environments (including recoverability), decide between performing aggregation operations in the SSIS pipeline or the relational engine, plan to automate SSIS deployment, plan the administration of the SSIS Catalog database
Design package configurations for SSIS packages
Avoid repeating configuration information entered in SSIS packages, and use configuration files

Preparation resources
Data profiling task
Create package configurations
Deployment of projects and packages
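
The package-execution objective above distinguishes incremental loads from full loads. One common incremental pattern an SSIS package can drive is a high-water-mark extract; a minimal T-SQL sketch with hypothetical names:

-- A watermark table records how far each source has been extracted.
DECLARE @LastLoad datetime2 =
    (SELECT LastLoadDate FROM etl.Watermark WHERE SourceTable = N'SalesOrders');

-- Pull only rows changed since the previous successful run.
SELECT OrderID, CustomerID, OrderTotal, ModifiedDate
FROM dbo.SalesOrders
WHERE ModifiedDate > @LastLoad;

-- After the load succeeds, advance the watermark to the extract's maximum
-- ModifiedDate (not the current time) so in-flight rows are never skipped.
UPDATE etl.Watermark
SET LastLoadDate = COALESCE(
        (SELECT MAX(ModifiedDate) FROM dbo.SalesOrders
         WHERE ModifiedDate > @LastLoad),
        LastLoadDate)
WHERE SourceTable = N'SalesOrders';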

QUESTION 1
You need to create the Sales Reporting shared SSRS data source.
Which SSRS data connection type should you use?

A. OData
B. Microsoft SQL Server
C. ODBC
D. OLE DB

Answer: B


QUESTION 2
You need to grant appropriate permissions to the SSISOwners SQL Server login.
What should you do?

A. Map the login to the SSISDB database. Assign the user to the ssis_admin role.
B. Map the login to the msdb database. Assign the user to the db_owner role.
C. Map the login to the msdb database. Assign the user to the db_ssisadmin role.
D. Map the login to the SSISDB database. Assign the user to the db_ssisadmin role.
E. Map the login to the SSISDB database. Assign the user to the db_owner role.
F. Map the login to the msdb database. Assign the user to the ssis_admin role.

Answer: D


QUESTION 3
You need to configure data refresh for the Manufacturing Performance PowerPivot workbook.
What should you do? (Each correct answer presents part of the solution. Choose all that apply.)

A. Configure the PowerPivot Data Refresh Timer Job to run every 60 minutes.
B. Restore the PowerPivot workbook to an SSAS instance in tabular mode.
C. Script a process command and configure a SQL Server Agent job to execute the command every 60 minutes.
D. Restore the PowerPivot workbook to an SSAS instance in PowerPivot for SharePoint mode.

Answer: A


QUESTION 4
You need to configure package execution logging to meet the requirements.
What should you do?

A. Configure logging in each ETL package to log the OnError, OnInformation, and Diagnostic events.
B. Set the SSIS catalog’s Server-wide Default Logging Level property to Performance.
C. Set the SSIS catalog’s Server-wide Default Logging Level property to Basic.
D. Set the SSIS catalog’s Server-wide Default Logging Level property to Verbose.
E. Configure logging in each ETL package to log the OnError, OnPreExecute, and OnPostExecute events.

Answer: B
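
For context on question 4: the SSIS catalog’s server-wide default logging level is a catalog property, settable through SSMS or T-SQL. A sketch, where 0 = None, 1 = Basic, 2 = Performance, 3 = Verbose:

-- Set the server-wide default logging level on the SSIS catalog.
EXEC SSISDB.catalog.configure_catalog
    @property_name  = N'SERVER_LOGGING_LEVEL',
    @property_value = 2;  -- Performance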

Click here to view complete Q&A of 70-467 exam
Certkingdom Review

MCTS Training, MCITP Training

Best Microsoft MCTS Certification, Microsoft 70-467 Training at certkingdom.com

Exam 70-458 Transition Your MCTS on SQL Server 2008 to MCSA: SQL Server 2012, Part 2

Published: June 11, 2012
Languages: English, Japanese
Audiences: IT professionals
Technology: Microsoft SQL Server 2012
Credit toward certification: MCP, MCSA, MCSE

Skills measured
This exam measures your ability to accomplish the technical tasks listed below. The percentages indicate the relative weight of each major topic area on the exam. The higher the percentage, the more questions you are likely to see on that content area on the exam. View video tutorials about the variety of question types on Microsoft exams.

Please note that the questions may test on, but will not be limited to, the topics described in the bulleted text.

Do you have feedback about the relevance of the skills measured on this exam? Please send Microsoft your comments. All feedback will be reviewed and incorporated as appropriate while still maintaining the validity and reliability of the certification process. Note that Microsoft will not respond directly to your feedback. We appreciate your input in ensuring the quality of the Microsoft Certification program.

If you have concerns about specific questions on this exam, please submit an exam challenge.

If you have other questions or feedback about Microsoft Certification exams or about the certification program, registration, or promotions, please contact your Regional Service Center.

Manage data
Configure and maintain a backup strategy
Manage different backup models, including point in time recovery; protect customer data even if backup media is lost; perform backup/restore based on proper strategies including backup redundancy; recover from a corrupted drive; manage a multi-terabyte database; implement and test a database implementation and a backup strategy (multiple files for user database and tempdb, spreading database files, backup/restore); back up a SQL Server environment; back up system databases
Restore databases
Restore a database secured with TDE; recover data from a damaged DB; restore to a point in time; file group restore; page level restore
Implement and maintain indexes
Inspect physical characteristics of indexes and perform index maintenance; identify fragmented indexes; identify unused indexes; implement indexes; defrag/rebuild indexes; set up a maintenance strategy for indexes and statistics; optimize indexes (full, filter); statistics (full, filter) force or fix queue; when to rebuild vs. reorg and index; full text indexes; column store indexes
Import and export data
Transfer data; bulk copy; bulk insert

Preparation resources
Back up and restore of SQL Server databases
File restores (full recovery model)
DBCC INDEXDEFRAG (Transact-SQL)
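
The backup and restore objectives above include point-in-time recovery under the full recovery model. A minimal sketch with hypothetical file names and times:

-- Restore the most recent full backup without recovering the database...
RESTORE DATABASE SalesDb
FROM DISK = N'D:\Backups\SalesDb_Full.bak'
WITH NORECOVERY;

-- ...then roll the log forward, stopping just before the error occurred.
RESTORE LOG SalesDb
FROM DISK = N'D:\Backups\SalesDb_Log.trn'
WITH STOPAT = N'2016-06-01T10:45:00', RECOVERY;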

Implement security
Manage logins and server roles
Configure server security; secure the SQL Server using Windows Account/SQL Server accounts, server roles; create log in accounts; manage access to the server, SQL Server instance, and databases; create and maintain user-defined server roles; manage certificate logins
Manage database permissions
Configure database security; database level permissions; protect objects from being modified
Manage users and database roles
Create access to server/database with least privilege; manage security roles for users and administrators; create database user accounts; contained logins
Troubleshoot security
Manage certificates and keys; endpoints

Preparation resources
Server-level roles
Permissions (database engine)
Database-level roles
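
The security objectives above center on least-privilege access through logins, users, and roles. A minimal sketch (all principal and schema names are hypothetical):

-- Server-level login, then a database user mapped to it.
CREATE LOGIN ReportReader WITH PASSWORD = N'<strong password here>';

USE SalesDb;
CREATE USER ReportReader FOR LOGIN ReportReader;

-- Grant access through a role rather than directly to the user,
-- so permissions stay manageable as membership changes.
CREATE ROLE SalesReporting;
GRANT SELECT ON SCHEMA::Sales TO SalesReporting;
ALTER ROLE SalesReporting ADD MEMBER ReportReader;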

Implement high availability
Implement AlwaysOn
Implement a mirroring solution using AlwaysOn; failover
Implement database mirroring
Set up mirroring; monitor the performance of database mirroring
Implement replication
Troubleshoot replication problems; identify appropriate replication strategy

Preparation resources
AlwaysOn Availability Groups (SQL Server)
Microsoft SQL Server AlwaysOn solutions guide for high availability and disaster recovery
AlwaysOn architecture guide: Building a high availability and disaster recovery solution by using AlwaysOn Availability Groups
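
The AlwaysOn objective above can be sketched in T-SQL once the Windows failover cluster and database mirroring endpoints exist; the server, domain, and group names below are hypothetical:

-- Minimal two-replica availability group with automatic failover.
CREATE AVAILABILITY GROUP SalesAG
FOR DATABASE SalesDb
REPLICA ON
    N'SQLNODE1' WITH (
        ENDPOINT_URL = N'TCP://SQLNODE1.contoso.local:5022',
        AVAILABILITY_MODE = SYNCHRONOUS_COMMIT,
        FAILOVER_MODE = AUTOMATIC),
    N'SQLNODE2' WITH (
        ENDPOINT_URL = N'TCP://SQLNODE2.contoso.local:5022',
        AVAILABILITY_MODE = SYNCHRONOUS_COMMIT,
        FAILOVER_MODE = AUTOMATIC);

-- On the secondary, after restoring SalesDb WITH NORECOVERY:
-- ALTER AVAILABILITY GROUP SalesAG JOIN;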

Design and implement a data warehouse
Design and implement dimensions
Design shared/conformed dimensions; determine whether you need support for slowly changing dimensions; determine attributes; design hierarchies; determine whether you need star or snowflake schema; determine the granularity of relationship with fact tables; determine the need for auditing or lineage; determine keys (business transactional or your own data warehouse/surrogate keys); implement dimensions; implement data lineage of a dimension table
Design and implement fact tables
Design a data warehouse that supports many to many relationships; appropriately index a fact table using columnstore indexes; partitioning; additive measures; semi-additive measures; non-additive measures; implement fact tables; determine the loading method for the fact tables; implement data lineage of a fact table; design summary aggregation tables

Preparation resources
Introduction to dimensions (Analysis Services – multidimensional data)
Dimension relationships
Columnstore indexes
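
The dimension design objective above weighs business keys against warehouse surrogate keys and covers slowly changing dimensions. A minimal Type 2 dimension sketch with hypothetical names:

-- The surrogate key identifies each version of a member; the business key
-- (CustomerAltKey) can repeat across versions as tracked attributes change.
CREATE TABLE dbo.DimCustomer (
    CustomerKey    int IDENTITY(1,1) NOT NULL PRIMARY KEY,  -- surrogate key
    CustomerAltKey nvarchar(20) NOT NULL,                   -- business key
    City           nvarchar(50) NOT NULL,                   -- SCD Type 2 attribute
    StartDate      date NOT NULL,
    EndDate        date NULL,          -- NULL marks the current version
    IsCurrent      bit NOT NULL DEFAULT 1
);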

Extract and transform data
Design data flow
Define data sources and destinations; distinguish blocking and non-blocking transformations; use different methods to pull out changed data from data sources; determine appropriate data flow components; determine the need for supporting Slowly Changing Dimensions (SCD); determine whether to use SQL Joins or SSIS lookup or merge join transformations; batch processing vs. row by row processing; determine the appropriate transform to use for a specific task; determine the need and method for identity mapping and deduplicating; fuzzy lookup, fuzzy grouping, and Data Quality Services (DQS) transformation; determine the need for text mining; determine the need for custom data sources, destinations, and transforms; determine what to do with erroneous rows; determine auditing needs; determine sampling needs for data mining; trusted/authoritative data sources, including warehouse metadata
Implement data flow
Debug data flow; use the appropriate data flow components; SQL/SSIS data transformation; create SSIS packages that support slowly changing dimensions; use the Lookup task in SSIS; map identities using SSIS Fuzzy Lookup; specify a data source and destination; use data flows; different categories of transformations; read, transform, and load data; understand which transforms to use to accomplish a specific business task; data correction transformation; performance tune an SSIS dataflow; optimize Integration Services packages for speed of execution; maintain data integrity, including good data flow
Implement script tasks in SSIS
Determine whether it is appropriate to use a script task; extend the capability of a control flow; perform a custom action as needed (not on every row) during a control flow

Preparation resources
Data flow
Slowly changing dimension transformation
Script task
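
The data flow objectives above include supporting slowly changing dimensions, whether through the SSIS SCD transformation or set-based T-SQL. A compact sketch of the Type 2 expiry step, reusing the hypothetical DimCustomer table sketched earlier:

-- Expire current rows whose tracked attributes changed in the source;
-- a following INSERT (not shown) adds the new versions.
UPDATE d
SET d.EndDate = CAST(SYSDATETIME() AS date),
    d.IsCurrent = 0
FROM dbo.DimCustomer AS d
JOIN staging.Customer AS s
    ON s.CustomerAltKey = d.CustomerAltKey
WHERE d.IsCurrent = 1
  AND d.City <> s.City;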

Load data
Design control flow
Determine control flow; determine containers and tasks that are needed; determine precedence constraints; design an SSIS package strategy with rollback, staging, and transaction control; decide between one package or multiple packages; determine event handlers; determine variables; determine parameters on package and project level; determine connection managers and whether they are package or project level; determine the need for custom tasks; determine how much information you need to log from a package; determine the need for checkpoints; determine security needs
Implement package logic by using SSIS variables and parameters
User variables; variable scope, data type; implement parameterization of properties using variables; use variables in precedence constraints; refer to SSIS system variables; design dynamic SSIS packages; package configurations (file or SQL tables); expressions; package and project parameters; project level connection managers; implement dynamic package behavior; configure packages in SSIS for different environments; package configurations (XML configuration file, SQL Server table, registry entry, parent package variables, environment variable); parameters (package and project level); project connection managers; property expressions (use expressions for connection managers)
Implement control flow
Checkpoints; debug control flow; implement the appropriate control flow task to solve a problem; data profiling; use sequence containers and loop containers; manage transactions in SSIS packages; manage parallelism; use precedence constraint to control task execution sequence; create package templates; use the execute package task
Implement data load options
Implement a full and incremental data load strategy; plan for an incremental update of the relational Data Mart

Preparation resources
Integration Services transactions
Developing a custom task
Integration Services (SSIS) parameters
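
The variables-and-parameters objective above includes configuring packages for different environments; in the project deployment model that is done with SSIS catalog environments. A minimal sketch using the SSISDB catalog procedures (folder, environment, and variable names are hypothetical):

-- Create an environment and a variable inside an existing catalog folder.
EXEC SSISDB.catalog.create_environment
    @folder_name = N'ETL', @environment_name = N'Production';

EXEC SSISDB.catalog.create_environment_variable
    @folder_name      = N'ETL',
    @environment_name = N'Production',
    @variable_name    = N'SourceServer',
    @data_type        = N'String',
    @sensitive        = 0,
    @value            = N'PRODSQL01',
    @description      = N'Source connection server name';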

Configure and deploy SSIS solutions
Troubleshoot data integration issues
Performance issues; connectivity issues; execution of a task or transformation failed; logic issues; demonstrate awareness of the new SSIS logging infrastructure; troubleshoot a failed package execution to determine the root cause of failure; troubleshoot SSIS package failure from an invalid datatype; implement break points; data viewers; profile data with different tools; batch cleanup
Implement auditing, logging, and event handling
Audit package execution by using system variables; propagate events; use log providers; log an SSIS execution; create alerting and notification mechanisms; use Event Handlers in SSIS to track ETL events and errors; implement custom logging
Deploy SSIS solutions
Create and configure an SSIS catalog; deploy SSIS packages by using the deployment utility; deploy SSIS packages to SQL or file system locations; validate deployed packages; deploy packages on multiple servers; install custom components and tasks; deploy SSIS packages by using DTUTIL

Preparation resources
Troubleshooting tools for package development
Enable package logging in SQL Server data tools
Integration Services (SSIS) logging
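
The deployment and troubleshooting objectives above revolve around the SSIS catalog. Running a deployed package, and raising its logging level for a single execution while diagnosing a failure, can be sketched as follows (folder, project, and package names are hypothetical):

-- Create an execution for a deployed package...
DECLARE @exec_id bigint;
EXEC SSISDB.catalog.create_execution
    @folder_name  = N'ETL',
    @project_name = N'SalesETL',
    @package_name = N'LoadFactSales.dtsx',
    @execution_id = @exec_id OUTPUT;

-- ...request Verbose logging (3) for this run only, then start it.
EXEC SSISDB.catalog.set_execution_parameter_value
    @execution_id    = @exec_id,
    @object_type     = 50,                -- server option, not a package parameter
    @parameter_name  = N'LOGGING_LEVEL',
    @parameter_value = 3;

EXEC SSISDB.catalog.start_execution @execution_id = @exec_id;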

Build Data Quality solutions
Install and maintain Data Quality Services
Installation prerequisites; use Data Quality Server Installer; add users to the DQ roles; identity analysis, including data governance
Implement master data management solutions
Install Master Data Services (MDS); implement MDS; create models, entities, hierarchies, collections, and attributes; define security roles; import/export; subscriptions
Create a data quality project to clean data
Profile Online Transaction Processing (OLTP) and other source systems; data quality knowledge base management; create a data quality project; use Data Quality Client; improve data quality; identity mapping and deduplicating; handle history and data quality; manage data quality/cleansing

Preparation resources
Install Data Quality Services
Install Master Data Services
Master Data Services features and tasks


QUESTION 1
You administer a Microsoft SQL Server 2012 database that has TRUSTWORTHY set to ON.
You create a stored procedure that returns database-level information from Dynamic Management Views.
You grant User1 access to execute the stored procedure.
You need to ensure that the stored procedure returns the required information when User1 executes the stored procedure.
You need to achieve this goal by granting the minimum permissions required.
What should you do? Choose all that apply.

A. Grant the db_datareader role on the database to User1.
B. Modify the stored procedure to include the EXECUTE AS OWNER statement. Grant VIEW SERVER STATE permissions to the owner of the stored procedure.
C. Create a SQL Server login that has VIEW SERVER STATE permissions. Modify the stored procedure to include the EXECUTE AS {newlogin} statement.
D. Move the stored procedure to the User1 schema.
E. Grant the VIEW SERVER STATE permission to User1.

Answer: B,C


QUESTION 2
You administer a SQL Server 2012 database instance.
You need to configure the SQL Server Database Engine service on a failover cluster.
Which user account should you use?

A. a domain user
B. the SQLBrowser account
C. the BUILTIN\SYSTEM account
D. a local user with Run as Service permissions

Answer: A


QUESTION 3
You administer a Microsoft SQL Server 2012 database instance.
You plan to migrate the database to Windows Azure SQL Database. You verify that all objects contained in the database are compatible with Windows Azure SQL Database.
You need to ensure that database users and required server logins are migrated to Windows Azure SQL Database.
What should you do?

A. Back up the database from the local server and restore it to Windows Azure SQL Database.
B. Use the Copy Database wizard.
C. Use the Database Transfer wizard.
D. Use SQL Server Management Studio to deploy the database to Windows Azure SQL Database.

Answer: D


QUESTION 4
You are a database administrator for a Microsoft SQL Server 2012 environment.
You want to deploy a new application that will scale out the workload to at least five different SQL Server instances.
You need to ensure that for each copy of the database, users are able to read and write data that will then be synchronized between all of the database instances.
Which feature should you use?

A. peer-to-peer replication
B. snapshot replication
C. failover clustering
D. database audits

Answer: A


QUESTION 5
Note: This question is part of a series of questions that use the same set of answer choices. An answer choice may be correct for more than one question in the series.
You administer a SQL Server 2012 server that contains a database named SalesDb. SalesDb contains a schema named Customers that has a table named Regions. A user named UserA is a member of a role named Sales.
UserA is granted the Select permission on the Regions table. The Sales role is granted the Select permission on the Customers schema.
You need to remove the Select permission for userA on the Regions table. You also need to ensure that UserA can still access all the tables in the Customers schema, including the
Regions table, through the Sales role permissions.
Which Transact-SQL statement should you use?

A. DENY SELECT ON Object::Regions FROM Sales
B. DENY SELECT ON Schema::Customers FROM Sales
C. REVOKE SELECT ON Object::Regions FROM Sales
D. REVOKE SELECT ON Schema::Customers FROM Sales
E. DENY SELECT ON Object::Regions FROM UserA
F. DENY SELECT ON Schema::Customers FROM UserA
G. REVOKE SELECT ON Object::Regions FROM UserA
H. REVOKE SELECT ON Schema::Customers FROM UserA
I. EXEC sp_addrolemember ‘Sales’, ‘UserA’
J. EXEC sp_droprolemember ‘Sales’, ‘UserA’

Answer: G
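
The answer to question 5 turns on the difference between REVOKE and DENY: REVOKE removes a previously granted (or denied) permission but leaves permissions inherited through roles intact, while DENY blocks access even when a role grants it. A small illustration using the question’s principals and its object notation:

-- Removes only the explicit grant to UserA; access through the Sales
-- role's schema-level grant remains.
REVOKE SELECT ON Object::Regions FROM UserA;

-- By contrast, this would override the role's grant and block UserA
-- entirely, violating the requirement that role-based access remain:
-- DENY SELECT ON Object::Regions TO UserA;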


QUESTION 6
Note: This question is part of a series of questions that use the same set of answer choices. An answer choice may be correct for more than one question in the series.
You administer a SQL Server 2012 server that contains a database named SalesDb. SalesDb contains a schema named Customers that has a table named Regions. A user named userA is a member of a role named Sales.
UserA is granted the Select permission on the Regions table. The Sales role is granted the Select permission on the Customers schema.
You need to ensure that the following requirements are met:
The Sales role does not have the Select permission on the Customers schema.
UserA has the Select permission on the Regions table.
Which Transact-SQL statement should you use?

A. DENY SELECT ON Object::Regions FROM Sales
B. DENY SELECT ON Schema::Customers FROM Sales
C. REVOKE SELECT ON Object::Regions FROM Sales
D. REVOKE SELECT ON Schema::Customers FROM Sales
E. DENY SELECT ON Object::Regions FROM UserA
F. DENY SELECT ON Schema::Customers FROM UserA
G. REVOKE SELECT ON Object::Regions FROM UserA
H. REVOKE SELECT ON Schema::Customers FROM UserA
I. EXEC sp_addrolemember ‘Sales’, ‘UserA’
J. EXEC sp_droprolemember ‘Sales’, ‘UserA’

Answer: D

Click here to view complete Q&A of 70-458 exam
Certkingdom Review

MCTS Training, MCITP Training

Best Microsoft MCTS Certification, Microsoft 70-458 Training at certkingdom.com