Number of questions: 67 Number of questions to pass: 48 Time allowed: 90 mins Status: Live
This intermediate level certification is intended for Application Developers responsible for the development of integration services for business process applications. This certification focuses on application development with IBM Integration Designer V19.0 for deployment on IBM Business Automation Workflow V19.0.
This exam does not include IBM Process Designer or Process and Case Modeling.
This exam consists of eight sections described below. For more detail, please see the study guide on the Exam Preparation tab.
Installation and Configuration 4%
- Install and update IBM Integration Designer (IID)
- Install and configure a Unit Test Environment (UTE)
Service Component Architecture (SCA) programming model and solution design 22%
- Design and use Service Component Architecture (SCA)
- Design and use business objects
- Demonstrate an understanding of Service Component Architecture (SCA)
- Effectively organize a solution into modules, mediation modules, and libraries, taking into consideration component reuse and application maintainability
- Determine the appropriate use of macroflows (long-running processes), microflows (short-running processes), and mediations
- Effectively use quality of service (QoS) qualifiers
- Demonstrate understanding of and apply performance considerations for business integration solutions, including long-running processes
- Configure dynamic invocation using Gateway patterns
- Monitor business processes using the Dynamic Event Framework (DEF) and audit logging
BPEL Development 15%
- Design and implement Business Process Execution Language (BPEL) processes using the business process editor
- Use correlation sets in the BPEL process
- Demonstrate understanding of transaction behavior
- Implement custom logic using the visual snippet editor and Java code
- Implement error handling and compensation within a business process
- Demonstrate an understanding of working with Human tasks
- Create new versions of the BPEL process
Mediation Development 15%
- Describe the Service Message Object (SMO)
- Implement fault handling in mediation modules
- Develop mediation flows
- Use mediation primitives and subflows in mediation flows
- Transform data using maps (XSLT and Business Object)
- Use dynamic service routing through Dynamic Endpoint Selection
- Design a parallel flow (fan-in/fan-out)
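The fan-out/fan-in parallel flow named in the last objective is easy to picture outside of a mediation flow. Below is a minimal Python sketch of the pattern, not IBM mediation primitives; the quote services and the min-price aggregation are hypothetical stand-ins for whatever targets and fan-in logic a real flow would use:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical backend services a mediation might fan out to in parallel.
def quote_service_a(amount):
    return amount * 1.10

def quote_service_b(amount):
    return amount * 1.05

def fan_out_fan_in(amount):
    # Fan out: invoke each target service concurrently.
    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(svc, amount)
                   for svc in (quote_service_a, quote_service_b)]
        results = [f.result() for f in futures]
    # Fan in: aggregate all responses into a single reply.
    return min(results)

print(fan_out_fan_in(100))  # cheaper of the two quotes
```

In a mediation flow the Fan Out primitive plays the role of the submit loop and the Fan In primitive decides when enough responses have arrived to continue.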
Workflow Center Repository 11%
- Work with the Workflow Center perspective
- Import Process Applications and Toolkits
- Manage artifacts in the repository (associating, disassociating, and merging)
- Implement advanced integration services (emulating)
- Understand design considerations when working with Workflow Center Repository integration modules
Connectivity and Integration 13%
- Use and configure technology adapters, including the Java Database Connectivity (JDBC), FTP, email, Enterprise Content Management (ECM), and flat file adapters
- Configure import and export bindings (for example, JMS, MQ, Web Services, HTTP(S), and SCA)
- Demonstrate an understanding of the different SCA invocation styles: synchronous, asynchronous using one-way operations, asynchronous with callback, and asynchronous with deferred response
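The invocation styles listed above differ in how (and whether) the caller obtains a response. The following Python sketch illustrates the four patterns in a language-neutral way; the function names and the `service` stand-in are illustrative, not the actual SCA programming interfaces:

```python
import queue
import threading

def service(request):
    # Stand-in for a target SCA component (illustrative only).
    return request.upper()

# Synchronous: the caller blocks until the response is returned.
def invoke_sync(request):
    return service(request)

# Asynchronous one-way: fire and forget; no response is ever returned.
def invoke_one_way(request):
    threading.Thread(target=service, args=(request,)).start()

# Asynchronous with callback: the runtime delivers the response
# later to a callback the caller registered.
def invoke_async_callback(request, callback):
    threading.Thread(target=lambda: callback(service(request))).start()

# Asynchronous with deferred response: the caller holds a ticket and
# retrieves the response whenever it chooses to.
def invoke_async_deferred(request):
    response_q = queue.Queue(maxsize=1)
    threading.Thread(target=lambda: response_q.put(service(request))).start()
    return response_q  # caller invokes .get() when ready

print(invoke_sync("order"))                  # ORDER
print(invoke_async_deferred("order").get())  # ORDER
```

The trade-off the exam objective points at: synchronous calls are simplest but tie up the caller's thread and transaction, while the asynchronous styles decouple caller and target at the cost of more complex response handling.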
Packaging and Deployment 9%
- Generate unmanaged integration module deployment packages
- Apply security to SCA applications
- Understand the use of shared libraries
- Use a Software Configuration Management (SCM) system with Integration Designer
Testing and Troubleshooting 11%
- Test business integration solutions using component tests
- Configure and use the integration test client tool to test components
- Use Business Process Choreographer (BPC) Explorer for testing and troubleshooting long-running processes and tasks
- Use appropriate server logs and cross-component trace (XCT) for problem determination
- Use the integration debugger to debug business integration components
- Demonstrate an understanding of the Failed Event Manager (FEM) and recovery of events
Number of questions: 60 Number of questions to pass: 42 Time allowed: 90 mins Status: Live
An IBM Certified Administrator on IBM Db2 12 for z/OS is the lead database administrator for the Db2 product on the z/OS operating system. This individual has significant experience as a DBA and extensive knowledge of Db2, up to and including Db2 12 for z/OS Function Level 506. This individual is capable of performing intermediate to advanced tasks related to database design and implementation, operation and recovery, security and auditing, distributed connectivity, performance, installation and migration/upgrades specific to the z/OS operating system.
This exam consists of 7 sections described below. For more detail, please see the study guide on the Exam Preparation tab.
Section 1: Database Design and Implementation 22%
- Design tables and views
- Explain the different performance implications of identity column, ROWID, and sequence column definitions
- Design indexes
- Design universal table spaces
- Perform partitioning
- Normalize data (E-R model, process model) and translate the data model into a physical model
- Understand and describe Db2 database-enforced integrity rules
- Use the appropriate method to alter Db2 objects
- Explain the use and impact of different encoding schemes
Section 2: Operation and Recovery 20%
- Explain and execute commands and subcommands for normal operational conditions
- Explain and execute commands and utility control statements for use in abnormal conditions
- Load and unload data into and from Db2 tables
- Reorganize objects when necessary
- Monitor objects by collecting statistics
- Monitor and manage threads and utilities
- Identify and respond to advisory/restrictive statuses on Db2 objects
- Identify and perform problem determination
- Perform object and catalog/directory health checks
- Identify and perform actions needed to protect databases from planned and unplanned outages
- Identify and perform actions needed to recover data to a consistent point in time
Section 3: Security and Auditing 10%
- Explain privileges and authorities
- Protect access to Db2 and its objects
- Audit Db2 activity and resources and identify primary audit techniques
- Identify and respond appropriately to symptoms from trace output or error messages that signify security problems
- Explain and use roles, trusted contexts, data masking, and encryption
Section 4: Performance 21%
- Plan for performance monitoring by setting up and running monitoring procedures
- Analyze performance (manage and tune CPU requirements, memory, I/O, locks, response time, index and table compression)
- Analyze and respond to RUNSTATS statistics analysis
- Determine when and how to perform REBIND
- Describe Db2 interaction with Workload Manager (WLM)
- Review and tune SQL
- Monitor dynamic SQL performance
- Design features for performance
- Explain how to control and influence access paths
Section 5: Installation and Migration 9%
- Recommend settings for critical Db2 system parameters
- Identify and explain Db2 data sharing components and commands
- Describe the use of premigration checklists
- Describe the function of the catalog and directory
- Demonstrate understanding of the Db2 12 continuous delivery process
- Explain the migration process and activation of new function levels
Section 6: Additional Database Functionality 9%
- Explain and use SQL constructs
- Explain the use of stored procedures
- Explain the use of SQL/XML and XQuery
- Explain the use of user-defined functions
- Explain and use global variables
- Describe the use of triggers, the MERGE statement, pagination options, and JSON support
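The pagination options mentioned in the last objective revolve around limiting a sorted result set to one page at a time. The idea can be sketched with Python's built-in sqlite3 module; the `orders` table is invented for illustration, and note that sqlite3 uses `LIMIT`/`OFFSET` where Db2 for z/OS would use `OFFSET n ROWS FETCH FIRST m ROWS ONLY`:

```python
import sqlite3

# Toy data set standing in for a Db2 table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, item TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(i, f"item{i}") for i in range(1, 11)])

def page(conn, page_no, page_size=3):
    # Db2 equivalent of the clause below:
    #   ORDER BY id OFFSET :skip ROWS FETCH FIRST :page_size ROWS ONLY
    offset = (page_no - 1) * page_size
    return conn.execute(
        "SELECT id, item FROM orders ORDER BY id LIMIT ? OFFSET ?",
        (page_size, offset)).fetchall()

print(page(conn, 2))  # the second page: rows 4..6
```

An ORDER BY on a unique key is essential in either dialect; without it, successive pages are not guaranteed to be disjoint.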
Section 7: Distributed Access 9%
- Demonstrate understanding of distributed data access and the role of the communications database
- Demonstrate understanding of Distributed Data Facility (DDF) implementation options
- Recommend values for critical DDF configuration settings
- Understand and implement distributed data access
- Describe Db2 support for native REST services and differentiate it from a RESTful API
Number of questions: 60 Number of questions to pass: 41 Time allowed: 90 mins Status: Live
IBM Cloud Object Storage V3.14 Specialist tests an individual's knowledge of the concepts and theory of IBM Cloud Object Storage V3.14. The test covers in-depth knowledge of the basic to intermediate tasks required in day-to-day use of the product.
IBM Cloud Object Storage V3.14 Specialist 100%
Section 1 – IBM Cloud Object Storage Concepts
- Provide a high-level overview of the architectures for deploying IBM Cloud Object Storage
- Describe the role of a Manager, Accesser, and Slicestor in an IBM Cloud Object Storage network
- Describe the background processes in IBM Cloud Object Storage
- Describe use cases for Concentrated Dispersal mode (CD mode)
- Explain how the choice of IDA affects the capacity and resiliency of an IBM Cloud Object Storage system
- Explain object versioning in IBM Cloud Object Storage
- Describe how security technology is used in IBM Cloud Object Storage
- Describe features in IBM Cloud Object Storage and their components
- Describe device expansion, migration, and replacement
- Describe role-based access: authentication, authorization, LDAP/AD integration, Keystone, access/secret keys, and appliance role-based access
- Describe retention/compliance
- Determine if or when to use Container mode
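The IDA objective above comes down to simple arithmetic: an Information Dispersal Algorithm is characterized by its width (total slices per object) and its thresholds, and the ratio of width to threshold determines both the raw-storage expansion and how many slices can be lost before data becomes unreadable. The 12/7 figures below are illustrative, not a recommended configuration:

```python
def ida_overhead(width, threshold):
    """Expansion factor: raw slices stored per unit of usable data."""
    return width / threshold

def slices_tolerable(width, read_threshold):
    """Slices that can be lost while the data stays readable."""
    return width - read_threshold

# Example: a 12-wide IDA with a read threshold of 7.
print(round(ida_overhead(12, 7), 2))  # ~1.71x raw storage per usable unit
print(slices_tolerable(12, 7))        # 5 slice failures tolerated
```

A wider gap between width and threshold buys resiliency at the cost of a larger expansion factor, which is exactly the capacity-versus-resiliency trade-off the objective asks candidates to explain.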
Section 2 – Installation and Administration
- Install IBM Cloud Object Storage
- Configure the system and associated Accessers and Slicestors
- Create Cloud Object Storage storage pools
- Create access pools
- Create vaults and vault templates
- Understand the concept of vault mirrors and how to create them
- Create and manage users in Vault mode
- Configure administration preferences
- Back up and restore the System Manager
- Configure system security

Section 3 – Monitoring and Troubleshooting
- Monitor device and drive health
- Recover and replace failing devices and drives
- View and export reports from the System Manager
- Use graphs and the event console for planning and troubleshooting purposes
- Create and configure alerts (SMTP, SNMP, syslog forwarding)
- Collect logs
Number of questions: 63 Number of questions to pass: 45 Time allowed: 90 mins Status: Live
An IBM Certified Associate – IBM Tivoli Netcool/OMNIbus V8.1 is an individual with entry level knowledge and experience with Netcool/OMNIbus V8.1. This individual is knowledgeable about the fundamental concepts of Netcool/OMNIbus V8.1 through either hands on experience or formal and informal education. The associate should have an in-depth knowledge of the basic to intermediate tasks required in day-to-day use of Netcool/OMNIbus V8.1.
Key Areas of Competency:
1. Explain the basic architecture of IBM Tivoli Netcool/OMNIbus V8.1.
2. Configure and use IBM Tivoli Netcool/OMNIbus V8.1 Web GUI within the DASH Portal.
3. Describe the options for visualizing IBM Tivoli Netcool/OMNIbus V8.1 events.
Basic Architecture 21%
- Explain the basic function of OMNIbus probes in the architecture
- Explain the basic function of OMNIbus gateways in the architecture
- Explain the basic architecture of the Web GUI
- Explain the basic architecture of OMNIbus
- Explain the basics of event handling in the ObjectServer
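Event handling in the ObjectServer centers on deduplication: repeated events carrying the same Identifier are collapsed into a single alert row whose Tally count is incremented rather than inserted as new rows. A minimal sketch of that behavior in plain Python (not ObjectServer SQL; the field names mirror the standard alerts.status columns):

```python
def insert_event(alerts, event):
    """Deduplicate on Identifier, as the ObjectServer does on insert."""
    key = event["Identifier"]
    if key in alerts:
        row = alerts[key]
        row["Tally"] += 1                  # repeat occurrence: bump the count
        row["Summary"] = event["Summary"]  # refresh with the latest details
    else:
        alerts[key] = {**event, "Tally": 1}  # first occurrence: new row

alerts = {}
insert_event(alerts, {"Identifier": "node1.linkdown", "Summary": "Link down"})
insert_event(alerts, {"Identifier": "node1.linkdown", "Summary": "Link down"})
print(alerts["node1.linkdown"]["Tally"])  # 2
```

In the real product, the probe rules file constructs the Identifier, so choosing its fields well is what makes deduplication effective.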
UNIX, Linux, and Windows Environment 10%
- Configure the Netcool/OMNIbus communications file (using the CLI) so that Netcool/OMNIbus can be run on a UNIX/Linux platform
- Configure the Netcool/OMNIbus communications file (using the GUI) so that Netcool/OMNIbus can be run on a UNIX/Linux platform
- Configure the Netcool/OMNIbus communications file so that Netcool/OMNIbus can be run on a Windows platform
New features released in OMNIbus 8.1 14%
- Understand the Event Viewer feature
- Describe the differences between the Event Viewer and the Active Event List
- Understand the use of the initial configuration wizard (nco_icw)
- Understand the difference between lightweight and Java-based configuration modes
- Understand the use of the Data Source configuration user interface (UI)
Native OMNIbus Tools 17%
- Access and configure the Native Event List so that events are filtered and visible in a predefined format
- Run basic SQL commands so that ObjectServer information is displayed
- Access and use the OMNIbus Native Administrator client
Web GUI Portal 38%
- Configure DASH roles to permit user access to the Web GUI and/or Web GUI administration
- Customize the Event Dashboard to present different monitor views
- Manage the display of events in the Event Viewer
- Manage alerts with the Event Viewer
- Create and configure Web GUI filters
- Create and configure Web GUI views
- Create and configure Web GUI tools
- Create and configure Web GUI prompts
- Create and configure Web GUI menus
- Use the Quick Filter feature in the Event Viewer to limit the events being displayed
- Display additional information about an event (journal entries, details, or undisplayed columns)
- Modify the widgets in the portal
- Understand OMNIbus Web GUI mobile applications
Number of questions: 60 Number of questions to pass: 42 Time allowed: 90 mins Status: Live
An IBM Certified Solution Architect – Cloud Pak for Data V2.5 is a person who can design, plan, and architect a cloud solution. They can do this with limited assistance from support, documentation, and/or relevant subject matter experts.
During exam development, the Subject Matter Experts (SMEs) define all of the tasks, knowledge, and experience that an individual would need in order to successfully fulfill their role with the product or solution. The exam item writers use these objectives to develop the questions that will appear on the exam.
Section 2: Cloud Pak for Data Architecture 31%
- Understand licensing/entitlement and controls
- Understand the Cloud Pak for Data reliability architecture
- Understand how to secure the system and client data
- Understand the Cloud Pak for Data services architecture
Section 3: Planning 24%
- Understand the available services for architecting a solution
- Identify networking and operating system requirements needed by a solution
- Identify key requirements and considerations for designing a solution
Section 4: Cloud Pak for Data Services and Functionality 23%
- Describe ways to collect data
- Describe how DataOps helps to ingest, organize, and govern data
- Understand the available analytics and AI functionality
- Understand the purpose of the Industry Accelerators
The sample test is designed to give the candidate an idea of the content and format of the questions that will be on the certification exam. Performance on the sample test is NOT an indicator of performance on the certification exam. This should not be considered an assessment tool.
Sample Test for Test C1000-066
Use the study guide to help pass this exam. A study guide is an easy-to-follow document that will help you prepare for this exam. The guide is free and can be downloaded immediately.
Study Guide PDF here
This exam has an Assessment Exam option: A1000-066 Assessment: IBM Cloud Pak for Data Solution Architect V2.5
Assessment exams are web-based exams that provide you, at a lower cost, with the ability to check your skills before taking the certification exam.
This assessment exam is available in: English
Passing the exam does not award you a certification; it is only intended to help you assess whether you are ready to take the certification exam. You can register for it at Pearson VUE, and it will provide a score report showing how you did in each section.
Learning Path
Solution Architect: IBM Cloud Pak for Data Build skills to discover the use of the IBM Cloud Pak for Data platform, build foundational knowledge and expand to more advanced topics.
OpenShift Technical Review OpenShift Container Platform Technical Overview: a high-level introduction to the components that make up OpenShift Container Platform.
Web Based Training
Containers and Kubernetes Essentials with IBM Cloud Get hands-on experience with Kubernetes container orchestration. Learn how Kubernetes and IBM Cloud Kubernetes Service help you more easily deploy and scale containers and applications.
Cloud Pak for Data Community Engage with the other members of your community to better use the product to collect, organize, and analyze your data to accelerate the value of data science and AI in your own environment.
OpenShift Technical Overview OpenShift Container Platform Technical Overview: a high-level introduction to the components that make up OpenShift Container Platform.
Cloud Pak for Data – Foundations (eLearning class) This learning offering will tell a holistic story of Cloud Pak for Data including collaboration across an organization, which is key in this platform. Applicable to all personas. Multiple use cases will provide understanding of how organizations can benefit from Cloud Pak for Data. A variety of features will also be explored, providing students with the insight on how to use the platform.
DataOps Manifesto Through firsthand experience working with data across organizations, tools, and industries we have uncovered a better way to develop and deliver analytics that we call DataOps.
Accessing Remote tables IBM Cloud Private for Data provides a secured and governed analytics platform that does not require you to move data. It is a common framework across multiple clouds to deploy analytic services for data collection, governance and data science.
Watson Machine Learning IBM Watson Machine Learning is an IBM Cloud service that’s available through IBM Watson Studio.
Watson Studio IBM Watson Studio, part of IBM Watson, can help you increase productivity by giving your team a single environment to work with the best of open source and IBM software, to build and deploy an AI solution.
Open Shift Architecture OpenShift v3 is a layered system designed to expose underlying Docker-formatted container image and Kubernetes concepts as accurately as possible, with a focus on easy composition of applications by a developer.
Watson Knowledge Catalog IBM Watson Knowledge Catalog is part of IBM Watson. It's a secure enterprise catalog to index, classify, and govern your data.
Cloud Pak for Data V2.5 announcement letter IBM Cloud Pak for Data V2.5 delivers a data and AI platform for hybrid multicloud on Red Hat OpenShift, featuring open source governance and AutoAI.
OpenScale IBM Watson OpenScale is the open platform to operate and automate AI across its lifecycle. With Watson OpenScale, eliminate barriers to enterprise-scale AI, increase business confidence in outcomes, and operationalize your AI, on any cloud.
OpenShift SubSystems Our Interactive Learning Scenarios provide you with a pre-configured OpenShift® instance, accessible from your browser without any downloads or configuration. Use it to experiment, learn OpenShift and see how we can help solve real-world problems.
Data Virtualization (eLearning class) This learning offering will focus on the Data Virtualization add-on to IBM Cloud Pak for Data. Students will learn about the Data Virtualization technology and be able to create connections to remote data sources.
Decision Optimization (eLearning class) IBM Decision Optimization allows you to run optimization models in IBM Cloud Private for Data, with a user-friendly environment in which you can combine optimization with data science.
Cognos Analytics: Dashboarding and Reporting (eLearning class) This offering provides analysts with an introduction to create dashboards with Cloud Pak for Data, and create dashboards and reports using the Cognos Analytics add-on.
Cloud Pak for Data Knowledge Center IBM® Cloud Pak for Data is a native cloud solution that provides a single unified interface for your team to connect to your data no matter where it lives, govern it, find it, and use it for analysis.
SPSS Modeler Foundations (eLearning class) SPSS Modeler is one of the add-on modules on IBM Cloud Pak for Data. This course reviews the basics of how to import, explore, and prepare data, and introduces the student to machine learning models with SPSS Modeler on Cloud Pak for Data.
Industry Accelerators The industry accelerators that are provided by IBM are a set of artifacts that help you address common business issues.
Review Cloud Pak for Data on a 7-14 day trial Explore this flexible multicloud data platform with free access to a hosted environment. This experience includes a guided journey where you will learn how to collect, organize, and analyze your data to build AI-powered applications.
Streaming Analytics Perform real-time analysis on data in motion as part of your IBM Cloud applications by using IBM® Streaming Analytics for IBM Cloud.
Networking with OpenShift Enterprise 3.1 Learn how OpenShift Enterprise 3.1 by Red Hat provides powerful networking and gives your applications and services dynamic availability.
OpenShift Playground (Experiment with OpenShift) This is a playground for trying out OpenShift 3.11. From here you can play with OpenShift using the web console or command line.
Cloud Pak for Data – Dashboards The badge earner is familiar with creating analytics dashboards to visualize data in IBM Cloud Pak for Data and knows how to share dashboards with colleagues in their enterprise. The individual can use templates to easily arrange and align their dashboard content and can employ different kinds of visualizations and maps in their dashboards. The badge earner is familiar with using widgets in dashboards, and can also filter, sort, calculate, and format data in dashboards.
Cloud Pak for Data – Modeling The successful badge earner is familiar with the platform and architecture of IBM Cloud Pak for Data, and demonstrates a comprehension of the workflow, projects, models, scripts, and jobs. This individual understands the machine learning environment, can deploy analytical models, and perform basic administrative functions. The badge earner is familiar with the tasks of the various personas, with a special emphasis on tasks performed by the data scientist in a real-world, working environment.
Cloud Pak for Data Essentials The successful badge earner can demonstrate comprehension for the platform and architecture of IBM Cloud Pak for Data, and the workflow and collaboration between the personas. The earner can access the various supported data sources, and can catalog, govern, and perform extract, transform, and load (ETL) on that data. The badge earner can also perform basic administrative tasks, set up projects, and analyze the data.
Cloud Pak for Data – Data Governance The badge earner is familiar with the enterprise information in IBM Cloud Pak for Data. This individual can logically structure enterprise information, discover relationships between assets, and ensure that the data is perpetually current. This individual understands how to create and import a data dictionary, edit and govern assets, and use collections. The badge earner is familiar with the tasks of the various personas, specifically those involved within the workflow.
Cloud Pak for Data – Data Access and Transformation The badge earner is familiar with the platform and architecture of IBM Cloud Pak for Data, and demonstrates a comprehension of data pipelines, connecting data sources, and creating data views. This individual can manipulate catalog metadata, configure data sources, automate business rules, and prepare data for analysis. The badge earner is familiar with the tasks of the various personas, with a special emphasis on tasks performed by the data engineer in a real-world, working environment.
Number of questions: 65 Number of questions to pass: 43 Time allowed: 90 mins Status: Live
An IBM Certified Deployment Professional – IBM Tivoli Network Manager V4.2 is a technical professional responsible for the planning, deployment, administration, and troubleshooting of IBM Tivoli Network Manager V4.2. This individual will be expected to perform these tasks with limited assistance from peers, product documentation, and support resources.
Key Areas of Competency
- Describe the IBM Tivoli Network Manager V4.2 architecture
- Design an IBM Tivoli Network Manager V4.2 architecture
- Install and configure IBM Tivoli Network Manager V4.2
- Administer IBM Tivoli Network Manager V4.2
- Troubleshoot IBM Tivoli Network Manager V4.2
Required Prerequisite Skills:
- Knowledge of OMNIbus, DASH/JazzSM, and TCR
- Knowledge of networks and network management
- Knowledge of SQL
- Use of scripting languages
- Basic systems administration skills
Planning 20%
- Gather deployment requirements
- Determine implementation architecture
- Create a detailed deployment plan
Installation 26%
- Configure user authentication method
- Create and configure databases
- Use the IBM Installation Manager to deploy necessary components (GUI/CLI)
- Configure integration to ObjectServer(s)
- Configure failover
- Install Network Health Dashboard
- Install ITNM reports in Tivoli Common Reporting
- Configure automatic starting of domains and components
Administration 34%
- Configure Individual User or Group Access
- Apply Fix Packs
- Administer discovery domains
- Configure network discovery
- Create or modify Active Object Class (AOC) files
- Configure Network Polling
- Administer Event Flow
- Customize ITNM GUIs
- Customize ITNM Discovery
- Configure Network Reports
- Use Network Health Dashboard
- Configure managed network device status
- Configure specialized discovery
- Administer MIBs
Troubleshooting 20%
- Understand, maintain and access the dNCIM database
- Use OQL to query ITNM
- Confirm component communications
- Troubleshoot discovery issues
- Specify logging and trace levels
- Locate and utilize diagnostic tools
Number of questions: 61 Number of questions to pass: 41 Time allowed: 90 mins Status: Live
Used to earn the IBM Certified Solution Architect – Watson IoT Maximo V1 certification.
Section 1: Basic Architecture 15%
- Define the basic architecture of Maximo Enterprise Asset Management/Asset Performance Management (EAM/APM) components (functional vs. technical)
- Describe the deployment options (on-prem, Cloud/SaaS, hybrid – including Kubernetes)
- Describe multi-language support in Maximo (language tables, configuring translations, loading translations, etc.)
- Describe the Maximo Architecture Solution view and data flows
- Describe the Maximo Enterprise Asset Management/Asset Performance Management (EAM/APM) mobility architecture
Section 2: Maximo Standard Functionality 22%
- Explain the key concepts of inventory management
- Explain the key concepts of Enterprise Asset Management/Asset Performance Management (EAM/APM)
- Explain the key concepts of procurement in support of work and inventory (asset) management
- Explain the key concepts of Contract Management in support of work and asset management
- Explain the key concepts of Work Management Enterprise Asset Management (EAM)
- Describe the 5 models available in Maximo Asset Performance Management (APM) on Cloud
Section 3: Maximo Industry Solutions/Add-ons 14%
- Describe the mobility solutions for Maximo Enterprise Asset Management (EAM)
- Describe the Maximo Scheduling Solutions
- Describe the Industry Solutions – New Models
- Describe the process of adding New Models to Maximo Asset Performance Management (APM)
Section 4: Reporting/Analysis 13%
- Discuss scoring and how it affects Work Management
- Discuss predictive and AI technology and how it affects work
- Discuss KPIs and Work Centers and how they impact Work Management
- Discuss reporting, both BIRT and Cognos, and how it affects Enterprise Asset Management/Asset Performance Management (EAM/APM)
Section 5: Security 11%
- Describe Maximo authentication methods
- Explain Maximo Security Groups and their configuration
- Explain Maximo security controls
- Describe how a single security asset management role is achieved by Maximo Enterprise Asset Management/Asset Performance Management (EAM/APM)
- Describe Maximo Cloud security options
Section 6: Initial Setup 11%
- Explain the purpose of Maximo's multi-org/multi-site capabilities
- Understand the General Ledger (GL) Account Structure and code usage in a Multi-Organization site configuration
- Summarize other initial setup requirements for Maximo based on customer usage
Section 7: Customization/Configuration 14%
- Explain the Maximo integration for Asset Performance Management (APM)
- Explain concepts of Maximo Formulas and their usage
- Explain the benefits of using Automation Scripting
- Understand the tools available in the Maximo UI framework for configuration and customization
- Explain when to use Migration Manager
Number of questions: 62 Number of questions to pass: 45 Time allowed: 90 mins Status: Live
This intermediate-level certification is intended for developers who are responsible for developing, publishing, configuring, and managing APIs using IBM API Connect 2018.x. This exam also covers administration and scripting topics, but it does not cover installation.
This exam consists of four sections described below. For more detail, please see the study guide on the Exam Preparation tab.
Architectural Overview of IBM API Connect 20%
- Articulate the architectural requirements to support a given IBM API Connect topology
- Compare the different deployment options
- Differentiate between the spaces and two types of organizations
- Demonstrate the various stages in the lifecycle of an API, including Create, Run, Manage, Secure, Test, and Monitor
- Distinguish between the various roles involved in the lifecycle of an API
- Implement the OpenAPI specification
- Identify typical use cases across industry
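"Implement the OpenAPI specification" refers to the API definition format that API Connect consumes and edits. As a point of reference, a minimal OpenAPI 2.0 (Swagger) definition looks like the sketch below; the API name and path are hypothetical, invented for illustration:

```yaml
swagger: "2.0"
info:
  title: inventory        # hypothetical API name
  version: 1.0.0
basePath: /inventory
paths:
  /items:
    get:
      summary: List inventory items
      responses:
        "200":
          description: A list of items
```

API Connect layers its own `x-ibm-configuration` extensions (assembly, policies, security) on top of a definition like this when an API is authored in the API Designer.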
Cloud/API Manager Role 21%
- Configure and manage the IBM API Connect cloud components
- Manage the IBM API Connect Cloud using the REST interface
- Use the IBM API Connect Command Line Interface
- Back up and restore IBM API Connect configuration data
- Back up and restore APIs and Products
- Analyze logs to identify problems within the IBM API Connect Cloud
- Secure the IBM API Connect Cloud
- Integrate with an external user registry
- Configure the API Gateway extensions
- Manage IBM API Connect catalogs
API Developer Role 27%
- Create and configure a SOAP API
- Create and configure a REST API
- Apply a security definition to an API
- Leverage API assembly components
- Use the Unit Testing tools to test APIs
- Implement user-defined policies
- Manage error handling
- Utilize API properties
- Use the IBM API Connect Developer Toolkit Command Line Interface
Product Manager Role 16%
- Distinguish between the various lifecycle stages of APIs and Products
- Gain business insight from analytics information
- Show the relationship between Products, Plans, and APIs
- Design Products and Plans
- Administer Consumer Access
Developer Portal 16%
- Distinguish between the various lifecycle stages of APIs and Products
- Gain business insight from analytics information
- Show the relationship between Products, Plans, and APIs
- Design Products and Plans
- Administer Consumer Access
The sample test is designed to give the candidate an idea of the content and format of the questions that will be on the certification exam. Performance on the sample test is NOT an indicator of performance on the certification exam. This should not be considered an assessment tool.
Use the study guide to help pass this exam. A study guide is an easy to follow document that will help you prepare for this exam. The guide is free and can be downloaded immediately.
This exam has an Assessment Exam option: A1000-044 Assessment: IBM API Connect 2018.x Solution Implementation
Assessment exams are web-based exams that provide you, at a lower cost, with the ability to check your skills before taking the certification exam.
This assessment exam is available in: English
Passing the exam does not award you a certification; it is only intended to help you assess whether you are ready to take the certification exam.
You can register for it at Pearson VUE, and it will provide a score report showing how you did in each section.
All IBM certification tests presume a certain amount of “on-the-job” experience which is not present in any classroom or Web presentation. The recommended courses and links will help you gain the skill and product knowledge represented in the test objectives. They do not teach the answers to the test questions and are not intended to do so. This information may not cover all subject areas in the certification test or may contain more recent information than is present in the certification test. Taking these or any classes will not guarantee that you will achieve certification.
Learning Path Solution Developer: IBM API Connect Build skills to help you create developer communities to publish and share APIs and engage with them through a self-service portal.
Number of questions: 69
Number of questions to pass: 42
Time allowed: 90 mins
Status: Live
This exam consists of 5 sections described below.
Section 1: Planning Review the customer’s environment Determine client backup methodology based on client data Explain current IBM Spectrum Protect continuous development cycle Discuss the different licensing models Explain Cloud Tiering: Limitations, Troubleshooting, and Possible Future Enhancements
Section 2: Installation Install the IBM Spectrum Protect server software and create the server instance Register the IBM Spectrum Protect server license Install administrative client command line Install and configure the backup-archive client Define storage hardware devices Install and configure IBM Spectrum Protect Operations Center Install and configure IBM Spectrum Protect HSM or IBM Spectrum Protect for Space Management on the Spectrum Protect Backup/Archive clients. (Windows and UNIX) Install and configure Spectrum Protect for Virtual Environments Install and configure the IBM Spectrum Protect Snapshot solution as provided within the IBM Spectrum Storage software suite Set up user authentication in the IBM Spectrum Protect server using LDAP Set up secure communications Install and configure Spectrum Protect for Mail and Databases
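As a sketch of the first installation objectives above, the commands below show how a server license is typically registered from the administrative command-line client once the server instance is running. The administrator credentials are placeholders, and the license certificate file name varies by edition.

```shell
# Connect with the administrative command-line client (placeholder credentials).
dsmadmc -id=admin -password=secret

# Inside dsmadmc: register the server license from a license certificate file
# shipped with the product (tsmee.lic is the Extended Edition certificate),
# then confirm which features are licensed.
register license file=tsmee.lic
query license
```

These commands assume a running server instance and must be issued by an administrator with system privilege.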
Section 3: Configuration Define Tape Libraries Set up traditional storage pools Set up directory container storage Set up cloud container storage pools Define a copy pool/active data pool Set up server-to-server communications Create Server pair to allow Storage pool protection and node replication Set up Policies Define Client Schedules Set up client-side data deduplication Identify, configure and control the tape volume lifecycle in IBM Spectrum Protect Install, configure and manage the IBM Spectrum Protect server/client in a clustered environment
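Several of the Section 3 objectives can be illustrated with a minimal administrative macro: a directory-container storage pool, a policy hierarchy that targets it, and a nightly client schedule. All pool, domain, node, and path names here are illustrative, not prescribed.

```shell
/* Directory-container storage pool with a backing directory */
define stgpool contpool stgtype=directory
define stgpooldirectory contpool /tsminst1/cont01

/* Policy: domain -> policy set -> management class -> backup copy group */
define domain proddom
define policyset proddom prodpol
define mgmtclass proddom prodpol standard
define copygroup proddom prodpol standard type=backup destination=contpool
assign defmgmtclass proddom prodpol standard
activate policyset proddom prodpol

/* Nightly incremental client schedule, a node, and their association */
define schedule proddom nightly action=incremental starttime=21:00
register node client01 secretpw domain=proddom
define association proddom nightly client01
```

The policy set must be activated before the domain's copy groups take effect, which is why `activate policyset` follows the `assign defmgmtclass` step.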
Section 4: Administration Define storage pool reclamation, migration, backup, and collocation Define data expiration Automate administrative processes by admin scheduling and IBM Spectrum Protect scripting Set up levels of administrative privileges Back up the IBM Spectrum Protect database Manage client options and client option sets Establish include/exclude lists Set up the Client Acceptor Daemon (CAD) Configure and manage the IBM Spectrum Protect options related to the server groups Configure the IBM Spectrum Protect server and client options to enable restoring or retrieving files from one workstation to another Set up NAS backups using NDMP, SMTape (SnapMirror) or SnapDiff Manage container storage pools Manage Alerts Transfer data to another tape technology Set up custom reports in the Operations Center Set up server options Convert traditional pools to container storage Configure Cloud Tiering using the Command Line or the Operations Center Describe Operations Center Security Dashboard Updates
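The include/exclude, client-side deduplication, and CAD objectives above all live in the client options file. The fragment below is a hedged sketch of a Windows dsm.opt (dsm.sys plays the same role on UNIX); the paths and the management class name are illustrative.

```text
* Client options fragment (dsm.opt). Comment lines start with "*".

* Include/exclude list: entries are processed bottom-up, so the include
* for c:\data wins over the general *.tmp exclusion for files under it.
exclude     *:\...\*.tmp
exclude.dir c:\temp
include     c:\data\...\* prodclass

* Enable client-side data deduplication
deduplication yes

* Run the scheduler under the client acceptor daemon (CAD)
managedservices schedule webclient
```

Because the list is read bottom-up, the order of include/exclude statements changes which rule applies to a given file, which is a frequent source of backup-selection surprises.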
Section 5: Problem Determination Monitor alerts in the Operations Center Perform problem determination on the server Perform library, volume, and container audits to identify problems Repair problems in libraries, volumes, and containers Identify environment bottlenecks (network, storage, SAN, memory, server hardware) Perform problem determination on the Spectrum Protect server database Perform problem determination on the client Configure and use Always ON Server Monitor Monitor replication status using replication failure summary
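The audit objectives in Section 5 map to a small set of administrative commands. The sketch below shows typical report-only invocations; library, volume, and container names are placeholders.

```shell
/* Verify that the library inventory matches the barcode labels */
audit library lib01 checklabel=barcode

/* Inspect a storage pool volume; fix=no reports problems without repair */
audit volume vol001 fix=no

/* Scan a directory-container extent for damage */
audit container /tsminst1/cont01/00/0000000000000000.dcf action=scanall

/* Review recent server activity when diagnosing a failure */
query actlog begintime=-01:00
```

Running audits with fix=no first is a common practice: it identifies damage so that a repair pass can be scheduled deliberately rather than triggered mid-diagnosis.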
Number of questions: 60
Number of questions to pass: 35
Time allowed: 90 mins
Status: Live
This exam consists of 4 sections described below.
Customer Environment, Requirements, and Plans 33% Describe a customer’s hardware and software environment.
Describe customer operational constraints, including power, cooling, personnel, knowledge level, and service level requirements.
Identify disaster recovery and high availability requirements.
Identify customer performance requirements, including throughput and latency.
Identify the customer’s capacity and growth requirements and the impact on total cost.
Product Information 30% Given a scenario, describe the difference(s) among various storage media and align them to a requirement of TCA, performance, and upgradability.
Identify the difference(s) in cost, performance, and reliability among IBM storage solutions.
Describe IBM’s competitive advantages.
Given a scenario, describe how IBM enterprise storage products help clients solve data and device management issues through application efficiency and integration.
Given a scenario, explain alternatives to the existing environment to provide additional functionality.
Application of Resources and Tools 20% Given a scenario, evaluate which tool(s) to use based upon key applications, performance requirements, customer pain-points, capacity and growth requirements, and the impact on total cost.
Identify uses of IBM maintenance, IBM websites, IBM tools, and IBM processes.
Given a scenario, identify high-level steps to integrate new solutions into an existing systems environment.
Containers, Clouds, and Analytics 17% Identify ways the IBM storage portfolio enables containers and clouds.
Identify ways the IBM storage portfolio enables analytics.
Identify ways the IBM storage portfolio provides security.
Overview PartnerWorld Code: C0005007
Replaces PW Code: 23003306
Status: Live
The IBM Storage Technical Specialist consults with customers to gather and
understand customer requirements and position the correct solutions. The
specialist uses available resources to design the solution. The specialist has a
broad knowledge of the features, functions, and benefits of IBM storage
solutions and the IBM storage portfolio, including disk, tape, storage
management software, and software defined storage. Successful candidates
understand competitive offerings used in the enterprise segment.
The IBM Storage technologies and solutions included in this exam are:
IBM Flash Storage
All flash solution concepts
Hybrid storage solutions including IBM System Storage DS8800 and IBM FlashSystem
Software-defined storage including the IBM Spectrum Suite; IBM Spectrum Accelerate, IBM Spectrum Control, IBM Storage Insights, IBM Spectrum Virtualize and IBM Spectrum Scale
IBM tape solutions and technologies
IBM Spectrum Protect
IBM Spectrum Protect Plus
IBM Spectrum Archive
Cloud Object Storage
SAN/Networks
VersaStack and converged infrastructure
General knowledge of hybrid cloud and analytics concepts and the IBM Storwize family of products is recommended.
This specialist can perform the following tasks without assistance:
Determine product positioning
Gather customer requirements
Identify and engage correct resources
Identify competitors
Provide the technical details for a TCO analysis
Present the IBM storage product line to the customer
Articulate features and benefits of IBM storage solutions
Read and understand a configuration based on customer requirements
Develop basic technology topologies and architectural designs
Present the configuration to the customer
Initiate the TDA process
Identify the correct tool to size the solution for performance
Identify open system, mainframe and IBM i connectivity requirements
Provide information gathered during customer interaction for post-installation
This specialist can perform the following tasks with assistance:
Respond to an RFP within the enterprise storage scope
Mitigate competitors
Demonstrate deep technical knowledge
Complete detailed design of the total solution
Conduct the IBM Technical Delivery Assessment (TDA)
Identify risks associated with the proposed solution
Plan implementation of the solution as part of the pre-install TDA
Submit an RPQ/SCORE request
Conduct a demonstration of the storage solution
This specialist should be familiar with the following resources:
eConfig
TCOnow!
IBM PartnerWorld
Sizing tools such as Disk Magic, Capacity Magic and Batch Magic
IBM Comprestimator Utility
Storage Consolidation Evaluation tool hosted by Alinean
System Storage Interoperability Center (SSIC)
Cognitive Storage Analytics
IBM Economics Study
IBM Data Reduction Estimator Tool
Recommended Prerequisite Skills
Minimum 12 months of direct technical storage experience, with six months experience with IBM designated storage products.
QUESTION 1
A customer with a highly virtualized VMware environment and several bare-metal Oracle servers is interested
in data protection and replication between its on-site and IBM cloud environments.
Which technology should be discussed?
A. IBM Spectrum Protect Extended Edition
B. IBM Spectrum Protect for Virtual Environments
C. IBM Spectrum Protect Snapshot
D. IBM Spectrum Protect Plus
Answer: B
Section: (none)
Explanation
Explanation/Reference:
Reference: https://www.ibm.com/downloads/cas/L9MD4MEZ
QUESTION 2
A customer has an existing IBM XIV Gen3 solution with 3 TB drives and nine modules. While performance is
acceptable, the customer wants to increase the performance of small-block reads. The current nine modules
are running at 60% of capacity.
Which solution should the technical specialist recommend?
A. Add an IBM FlashSystem 900 as a separate pool.
B. Add the SSD caching option on five modules.
C. Add the SSD caching option on all nine modules.
D. Add three more data modules to improve performance.
QUESTION 3
A customer has IBM and NetApp block and file storage. The customer is interested in reducing storage costs
and optimizing its data center through intelligent capacity planning, storage reclamation, storage tiering, and
performance metrics.
Which IBM SDS offering should the technical specialist discuss with the customer?
A. IBM Spectrum Protect
B. IBM Spectrum Control
C. IBM Spectrum Insights Pro
D. IBM Virtual Storage Center
Answer: C
Section: (none)
Explanation
Explanation/Reference:
Reference: https://www.ibm.com/support/knowledgecenter/en/SSQRB8/com.ibm.spectrum.si.doc/
prd_ovw_versions.html
QUESTION 4
A customer has a requirement for 250,000 IOPS and 200TB of capacity. The customer requires IBM SSR
code upgrades at no additional charge.
Which solution should the technical specialist recommend to the customer?
A. IBM FlashSystem A9000 with 3.6TB MicroLatency modules.
B. IBM FlashSystem V9000 with IBM Storwize V5010 with 1.2TB drives
C. IBM Storwize V7000 with 1.2TB drives and 800 GB SSDs
D. IBM DS8886 with 1.2TB drives and HPFE
Answer: D
Section: (none)
Explanation
Explanation/Reference:
QUESTION 5
Which exclusive feature of the IBM FlashSystem 9100 is a technical advantage over competitive storage
arrays?
A. 3D NAND SATA SSD drives
B. NVMe SSD drives
C. XL Flash modules
D. FlashCore modules