Category Archives: Dell

Prevent and Fix Windows Explorer Crashes

Windows 7 has sold more than 30 million copies, according to Microsoft. It's evident that it has been a big success, bringing great improvements over Windows XP.

Glitches will never go away entirely as long as software is software. They can be annoying at times, but there are ways to minimize the annoyances and prevent havoc. One such problem is with Windows Explorer, which likes to act weird with its "Windows Explorer has stopped responding" and "Windows Explorer has encountered a problem and needs to close" messages, blah blah.

You can always force-close it, but the Explorer process runs a lot of important things in Windows, and when it crashes, it can really slow you down.

But there is a way to minimize the damage by sandboxing its processes, so that the next time one crashes, it doesn't take the others down with it.

The solution to Windows Explorer crashes

Windows' Folder Options actually lets you separate the taskbar-and-desktop process from the other open Explorer windows, giving you two different explorer.exe processes (which you can see in Task Manager). This comes in handy when one of them crashes.

How to: Open any Explorer window (say, My Computer), press Alt+T to open the Tools menu, then go to Folder Options > View and check "Launch folder windows in a separate process".

[Image: windows-7-explorer-crash]
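
If you'd rather script this instead of clicking through the dialog, here is a minimal sketch in Python using the standard winreg module. It assumes the setting lives where it does on stock Windows 7, as the SeparateProcess value under Explorer's Advanced registry key (the same value the Folder Options checkbox toggles), and Explorer must be restarted for the change to take effect.

```python
# Minimal sketch: enable "Launch folder windows in a separate process"
# by writing the SeparateProcess value under Explorer's Advanced key,
# which is what the Folder Options checkbox toggles. Windows-only;
# restart Explorer (or log off and on) for the change to apply.
import winreg

ADVANCED_KEY = r"Software\Microsoft\Windows\CurrentVersion\Explorer\Advanced"

def set_separate_process(enabled: bool) -> None:
    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, ADVANCED_KEY, 0,
                        winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "SeparateProcess", 0, winreg.REG_DWORD,
                          1 if enabled else 0)

if __name__ == "__main__":
    set_separate_process(True)
    print("Separate-process mode enabled; restart Explorer to apply.")
```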

If you look more closely, it's the same idea that powers the Google Chrome browser, which spawns a new process for every tab; here, there are just two processes. Of course it will be slightly more resource-intensive, but that's definitely worth it.

Control Windows 7 with Kinect

Xbox Kinect is the ultimate hack machine, and that's what made it the most popular gadget of 2010. Kinect has often been hacked to work with a large number of hobby electronics to attain gesture-based control of systems. It has been hacked to control Windows 7 PCs a number of times, but it had never been packaged and sold as a regular software bundle.

Win&I is the first fully usable commercial software that lets you control your Windows 7 PC with a Kinect controller using gestures alone.

You can control Windows 7 and thousands of applications with this natural user interface. WIN&I software replaces the computer mouse by tracking simple gestures from users up to several meters from the screen using the power of the Kinect depth-sensor.

You just have to connect your Kinect to your PC and run the WIN&I control session to control Windows 7, and leave the rest to the Kinect.

Android tablet on the cheap: Acer Iconia A500 is the XOOM “Light”

Acer's Iconia A500 tablet features exactly the same hardware specifications as Motorola's XOOM, but with half the flash storage and a price $150 lower.

Back in early March, I proposed that Motorola create a new SKU for their XOOM tablet with a slightly reduced feature set, which I tentatively called the “XOOM Light”.

If you don’t remember the piece, here’s the crux of the argument:

If iSuppli’s BOM is to be believed to be within striking range of actual component costs, then Motorola can shave anywhere between 50 and 60 dollars off their manufacturing cost on the XOOM by slicing the flash storage in half to 16GB and going with less expensive camera parts, putting it closer on par with Apple’s iPad 2. This would allow the XOOM to retail for about $50-$80 less than the entry-level iPad 2, depending on how close the company wants to cut their margins.

The only way XOOM is going to be a repeat hit for Motorola Mobility in the same way the Droid was for their handset business is to undercut the iPad 2 on price. With a lower price and a similar feature set to the iPad 2, a large segment of consumers might be willing to overlook some of the shortcomings on Android 3’s current tablet app gap.

While this plea seems to be falling on deaf ears at Motorola, the idea apparently resonated with one of its competitors, the Taiwan-based PC giant Acer, Inc.

Today the company, along with channel partner Best Buy, announced pre-orders for its Android 3-based iPad competitor, the Iconia A500, which has nearly identical specifications to the Wi-Fi-only version of the Motorola XOOM.

See: Motorola XOOM Specifications

See: Acer Iconia A500 Specifications

The only significant difference? 16GB of flash memory instead of 32GB, which lines up with the entry-level Wi-Fi iPad quite nicely. At $450.00, the 10.1″ tablet costs $50 less than the entry-level iPad 2, and $140-$150 less than the XOOM.

It seems that Acer didn't even need to sacrifice camera capabilities to make up for component costs on the BOM in order to compete with the iPad 2: it has the same identically specced 2MP front-facing and 5MP LED-flash rear-facing cameras as the XOOM.

In addition to an identical SoC and PoP, RAM, GPU, screen, cameras and ports, the device is also the same thickness as the XOOM, at 12.9mm, roughly half an inch.

Note to all Android and other would-be tablet manufacturers: the bar for full-size 10.1″, 16GB devices that are not carrier-locked has now been lowered to $450.00. That means anything in the tablet market that is not Apple needs to cost that much or less, with similar specifications.

Oh and HP, regarding your WebOS 3 TouchPad, I’m also talking to you.

Today I pre-ordered one of the Iconia A500 devices, sight unseen. If it’s anywhere near as good a device as the XOOM as far as build quality is concerned, I’m willing to put up with some of the current shortcomings in Android 3 in order to keep a test Honeycomb device in house.

I’m not advocating that most consumers look at this device in lieu of the superior application ecosystem on the iPad 2, but this price point was enough for me to take action to meet my immediate requirements. For the same reason, Android developers will probably jump at the opportunity to buy one of these devices as well.

The question that remains is how Android 3 devices can actually differentiate themselves at the $450.00 price point, and whether it's even possible to lower that price bar any further. I think it can be lowered, but there's only one company that I believe can accomplish this: Amazon.

As I mentioned in my November 2010 piece “Kindle’s Secret Sibling: Amazon’s Android Tablet” it is my belief that the online retailer and bookseller has a definitive plan to release such a device.

The exact specs for such a device are left to the imagination and may be pure speculation, but certainly, if you eliminate the need for Google licensing of the Android OS and the Android Market along with Google's apps, replacing them with Amazon's monetizing equivalents, the price of a similarly specced tablet could easily be reduced by another $50-$80.

An Amazon Android tablet using the open source, unlicensed version of Honeycomb, combined with the Amazon Appstore for Android, Kindle for Android, the Amazon MP3 service, Amazon Video and Amazon Cloud Player, makes for a near-complete tablet OS and application stack, which would allow the company to monetize its properties on the device and take the "give away the razor but sell the blades" approach.

All of this depends, of course, on Google actually releasing the source code to Honeycomb beyond its approved vendor list, a delay that currently has the open source community absolutely livid.

The only things actually missing from such a theoretical device would be a native Android mail client to talk to Gmail, Hotmail and other services, including corporate Exchange email, along with the appropriate calendar sync services, mapping and integrated search engine apps.

Presumably the missing pieces could be coded in-house by Amazon, or purchased/licensed from a third party. I could certainly see Amazon partnering with someone like Microsoft to provide Bing search and mapping in lieu of Google, as well as a full license of ActiveSync and a native Exchange client.

Whatever the outcome, it’s clear that Acer has just brought prices for iPad 2 competitors down to much more realistic and acceptable levels. Whether it will plunge any further in the immediate future remains to be seen.

Has Acer re-defined the pricing for the full-sized Android tablet space? Talk Back and Let Me Know.

Once upon a time there was Ustream. Then came YouTube Live

YouTube has broadcast several live events in the past, but, as they noted on the official YouTube blog, they’ve all been “one-offs”. Internet users interested in streaming live video relied on Ustream and other similar services, while larger broadcasters streamed video independently. YouTube, as one of the most popular sites on the Internet, was essentially an asynchronous tool for Lady Gaga and cute cat owners.

Today, however, Google announced that YouTube would become a live broadcasting platform:

Today we’re announcing the initial roll out of YouTube Live, which will integrate live streaming capabilities and discovery tools directly into the YouTube platform for the first time. This begins with a new YouTube Live browse page (www.youtube.com/live), where you can always find the most compelling live events happening on YouTube and add events to your calendar.

Their initial live-streamed event will begin in a few hours with a YouTube-sponsored concert series featuring the most-viewed independent artists from the video-sharing site. More interesting, though, is the fact that this may finally become the social tool that Google has failed to create in its years of trying to compete with Facebook.

Although YouTube has featured comments on videos for some time, being able to comment and interact during a live stream takes this to a new level. Right now, aside from the so-called Digitour, the selection of live shows is relatively limited. As Google notes, they are “gradually rolling out [their] live streaming beta platform, which will allow certain YouTube partners with accounts in good standing to stream live content on YouTube.”

As this becomes more widespread, YouTube Live stands not only to change the face of broadcasting (can you say Google TV?) but also to change the face of social. It's one thing to interact over static media. It's quite another to interact (and ultimately, expect to interact) with live video.

Given YouTube's existing extraordinary reach, this stands to be a big deal. Like a really big deal. We'll see what happens when partners beyond the Indian Premier League cricket matches begin to go live. Because while some of the partners will be major media outlets, this has the potential to enable a new level of success for the pre-Justin Biebers of YouTube-land.

Parish declares that Facebook and Christianity don’t mix

A Roman Catholic parish in Chicago is warning parishioners about the dangers of Facebook. St. John Cantius parish leaders wrote in the church bulletin this past Sunday that Facebook is against the Christian culture. Social networking sites in general apparently encourage vanity and dishonesty by providing an outlet for children to create their own electronic version of reality, concocting their own identities and social realities with a reduced risk of real-world consequences.

The Chicago Tribune has the full quote of what parish leaders wrote in the church bulletin:

[Facebook] is exactly the opposite of the Christian culture where people go into the secrecy and sacredness of the confessional to blot out their sins forever. God entrusted parents with the care of their children for one particular purpose, and that is to teach them the way “to know, love, and serve God in this life and save their souls hereafter.” Everything leads us to think that Facebook fits poorly into this plan and was devised for a very different goal.

The church wants families to raise children without Facebook, as it supposedly helps youth defy their parents and cultivate feelings of lust. It’s rather worrying that families trying to raise their children in a wholesome environment are being told to avoid rather than educate.

Kids are future adults, and must thus learn about this world as much as they can, since they’ll be the ones managing it one day. At least for the foreseeable future, Facebook is part of this world.

TheStartupBuzz VC Showcase

The first event of its kind in India will bring venture capitalists, angel investors, start-up founders and entrepreneurs under one roof.

TheStartupBuzz VC Showcase will be held at the Capitol Hotel, Rajbhavan Road, Bengaluru on 31 October 2009. The VC Showcase is the first event of its kind in India where venture capitalists and angel investors showcase their firms and explain what they expect from entrepreneurs. Each VC/investor will have their own table, set up solely for meeting great companies and entrepreneurs.

The event will be attended by VCs/investors, start-up founders and entrepreneurs, and will bring start-up founders together to share experiences and interact with other start-ups. Three start-up founders will share their experiences of starting a venture, sustaining it and taking it to the next level.

Around 20 VCs/investors will showcase their firms, and 200 start-up CEOs and aspiring CEOs will meet and discuss during the event. In addition, there will be a wide range of sessions focussing on building a successful start-up, best practices, funding theory and more.

IT’s About Securing The Information DNA, And More!

The conference will provide opportunities for industry leaders, corporate decision makers, academics and government officials to exchange ideas on technology trends and best practices.

Securitybyte and OWASP India, organisations committed to raising InfoSec awareness in the industry, are hosting an information security event called Securitybyte & OWASP AppSec Asia 2009 at the Hotel Crowne Plaza, Gurgaon, Delhi NCR from 17 to 20 November 2009.

The highlight of the event is India's first information-security-focussed India Technology Leadership Summit 2009, with a panel discussion on security concerns in off-shoring among industry leaders representing outsourcers, service providers and regulators. The panel will be moderated by cyber security expert Howard Schmidt.

This year's conference will draw information security professionals from all over the world. Eighteen international speakers are coming in from the USA, New Zealand, Sweden, Germany, the UK, Canada, Thailand and Taiwan to talk on subjects such as "The international state of cyber security: Risk reduction in a high threat world" by Howard Schmidt and "Critical infrastructure security: Danger without borders" by John Bumgarner, to name a few.

The conference has three main tracks, focussed on security professionals, developers and leaders in the security space. Speakers like Kevvie Fowler will address security professionals on techniques used to bypass forensics in databases. Additionally, speakers like Jason Lam will reveal how their SANS DShield Webhoneypot Project is coming along. The Microsoft Security Response Center will reveal how things work under the covers in their team.

People attending the event will have the opportunity to take part in three different types of war games. These scenario-based games not only involve attacking Web applications and networks, but also show how real-world cyber attacks take place.

This event also marks the entry of the international information security training leaders SANS and ISC2, who are conducting two days of hands-on training delivered by their instructors from the USA. The four-day event will also host many advanced trainings, such as advanced forensics techniques, building advanced network security tools, advanced Web hacking and in-depth assessment techniques.

Click here to take part in the event.

Educated beyond common sense

A follow-on to Knowledge Normalization Methodology.

This is a perspective on the evolution of technology and the corresponding task of educating users to interface with next-generation computer application visions. The next frontier is understanding knowledge in such a way that it can be stored, interrogated, changed and managed according to business models, and engineering the meaning of related knowledge elements according to specific models, algorithms and formulae, all seeking to explain how, why, who, where and when a 'decision making' action produced the correlated relational data elements.

From transaction record keeping on clay tablets to punch-card file processing, information was stored as data elements (fields), then retrieved and presented as knowledge regarding a business activity. A watershed was reached when the hardware evolution introduced random-access capability for data elements: the Relational Data Base.

● The advent of Data Normalization Methodology.

This era of integrating RDB capabilities with management methodologies created a software industry to rationalize the results of human endeavors through data reasoning logic.

The USER education continues;____________________________

Canonical synthesis

From Wikipedia, the free encyclopedia

Generally, in mathematics, a canonical form (often called normal form or standard form) of an object is a standard way of presenting that object.

Entity-relationship model

From Wikipedia, the free encyclopedia

[Figure: a sample entity-relationship diagram using Chen's notation]

In software engineering, an entity-relationship model (ERM) is an abstract and conceptual representation of data. Entity-relationship modeling is a database modeling method, used to produce a type of conceptual schema or semantic data model of a system, often a relational database, and its requirements in a top-down fashion. Diagrams created by this process are called entity-relationship diagrams, ER diagrams, or ERDs.

Data modeling.

From Wikipedia, the free encyclopedia

Data modeling in software engineering is the process of creating a data model by applying formal data model descriptions using data modeling techniques.

Database normalization.

From Wikipedia, the free encyclopedia

In the field of relational database design, normalization is a systematic way of ensuring that a database structure is suitable for general-purpose querying and free of certain undesirable characteristics (insertion, update, and deletion anomalies) that could lead to a loss of data integrity.[1]

Database Normalization Basics

By Mike Chapple, About.com Guide

"If you've been working with databases for a while, chances are you've heard the term normalization. Perhaps someone's asked you "Is that database normalized?" or "Is that in BCNF?" All too often, the reply is "Uh, yeah." Normalization is often brushed aside as a luxury that only academics have time for. However, knowing the principles of normalization and applying them to your daily database design tasks really isn't all that complicated, and it could drastically improve the performance of your DBMS.

What is Normalization?

Normalization is the process of efficiently organizing data in a database. There are two goals of the normalization process: eliminating redundant data (for example, storing the same data in more than one table) and ensuring data dependencies make sense (only storing related data in a table). Both of these are worthy goals as they reduce the amount of space a database consumes and ensure that data is logically stored.
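
To make those two goals concrete, here is a minimal sketch in Python with invented sample data: the same orders stored unnormalized (the customer's city repeated in every row) and normalized (customer details stored once and referenced by key).

```python
# Invented sample data: the same orders, unnormalized vs. normalized.

# Unnormalized: Alice's city is repeated, so updating it means
# touching every one of her order rows (an update anomaly).
orders_unnormalized = [
    {"order_id": 1, "customer": "Alice", "city": "Chicago", "item": "lamp"},
    {"order_id": 2, "customer": "Alice", "city": "Chicago", "item": "desk"},
    {"order_id": 3, "customer": "Bob",   "city": "Boston",  "item": "chair"},
]

# Normalized: each fact is stored exactly once.
customers = {
    101: {"name": "Alice", "city": "Chicago"},
    102: {"name": "Bob",   "city": "Boston"},
}
orders = [
    {"order_id": 1, "customer_id": 101, "item": "lamp"},
    {"order_id": 2, "customer_id": 101, "item": "desk"},
    {"order_id": 3, "customer_id": 102, "item": "chair"},
]

# A join reconstructs the original view on demand.
for o in orders:
    c = customers[o["customer_id"]]
    print(o["order_id"], c["name"], c["city"], o["item"])
```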

● The advent of Data Normalized Project Management Methodology.

This era of integrating RDB capabilities with project management methodologies created new software industries specializing in project management products and services specific to particular market industries.

The USER education continues;____________________________

from: The history of PERT Network and project management.

Wiest, Jerome D., and Levy, Ferdinand K., A Management Guide to PERT/CPM, New Delhi: Prentice-Hall of India Private Limited, 1974

1. INTRODUCTION

Basically, CPM (Critical Path Method) and PERT (Programme Evaluation Review Technique) are project management techniques, which have been created out of the need of Western industrial and military establishments to plan, schedule and control complex projects.

1.1 Brief History of CPM/PERT

CPM/PERT or Network Analysis as the technique is sometimes called, developed along two parallel streams, one industrial and the other military.

CPM was the discovery of M.R. Walker of E.I. Du Pont de Nemours & Co. and J.E. Kelly of Remington Rand, circa 1957. The computation was designed for the UNIVAC-I computer. The first test was made in 1958, when CPM was applied to the construction of a new chemical plant. In March 1959, the method was applied to a maintenance shutdown at the DuPont works in Louisville, Kentucky. Unproductive time was reduced from 125 to 93 hours.

PERT was devised in 1958 for the POLARIS missile program by the Program Evaluation Branch of the Special Projects Office of the U.S. Navy, helped by the Lockheed Missile Systems division and the consulting firm of Booz-Allen & Hamilton. The calculations were arranged so that they could be carried out on the IBM Naval Ordnance Research Computer (NORC) at Dahlgren, Virginia.
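
At its core, CPM is a longest-path computation over the task dependency graph: the critical path is the chain of dependent tasks whose durations sum the highest, and it fixes the minimum project length. Here is a minimal sketch in Python on an invented four-task project; a full implementation would also compute slack for each task.

```python
# Minimal CPM sketch on an invented project: each task has a duration
# (days) and a list of prerequisite tasks. The project length equals
# the longest dependency chain (the critical path).
from functools import lru_cache

tasks = {
    "design": (3, []),
    "order":  (2, ["design"]),
    "build":  (5, ["design", "order"]),
    "test":   (2, ["build"]),
}

@lru_cache(maxsize=None)
def earliest_finish(task: str) -> int:
    duration, prereqs = tasks[task]
    return duration + max((earliest_finish(p) for p in prereqs), default=0)

project_length = max(earliest_finish(t) for t in tasks)
print("Minimum project length:", project_length, "days")
# design(3) -> order(2) -> build(5) -> test(2) = 12 days
```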

● The advent of Data Normalized Business Intelligence Methodology.

Now begins an era of reverse engineering the meaning of related data elements according to specific models, algorithms, formulae and more, from differing RDB designs (ORACLE, SYBASE, Microsoft), all seeking to explain how, why, who, where and when a 'decision making' action produced the correlated relational data elements.

Each industry using RDB technology (that's pretty much everybody) has developed unique language descriptions to best fit its business, product and service operations. This, combined with each industry's unique product or service needs, has created a new software niche for developing 'business theory models' for analysis between software logic and business logic.

The ongoing business intelligence software development process will continue to orchestrate language as a source of knowledge to explain how, why, who, where and when an action produced the correlated relational data elements. This process is ultimately destined to examine business knowledge frozen in time, rewriting the language (English grammatical sentences, RULES) by which the decisions that affect business operations are made. It simply reverse engineers (rewrites) the business language that best explains decision making at a particular point in time. Real-time access to the language of business decision making would therefore provide for faster reaction to changes in the operation of the enterprise. Achieving a real-time relationship with an enterprise operation requires that the language of the enterprise be normalized into a Relational Knowledge Base for that enterprise operation.

The USER education continues;____________________________

Business intelligence

From Wikipedia, the free encyclopedia

Business intelligence (BI) refers to computer-based techniques used in spotting, digging-out, and analyzing business data, such as sales revenue by products and/or departments or associated costs and incomes. [1]

BI technologies provide historical, current, and predictive views of business operations. Common functions of Business Intelligence technologies are reporting, online analytical processing, analytics, data mining, business performance management, benchmarking, text mining, and predictive analytics.

Business Intelligence often aims to support better business decision-making.[2] Thus a BI system can be called a decision support system (DSS).[3] Though the term business intelligence is often used as a synonym for competitive intelligence, because they both support decision making, BI uses technologies, processes, and applications to analyze mostly internal, structured data and business processes while competitive intelligence is done by gathering, analyzing and disseminating information with or without support from technology and applications, and focuses on all-source information and data (unstructured or structured), mostly external, but also internal to a company, to support decision making.

● The advent of Knowledge Normalization Methodology.

IDENTIFYING KNOWLEDGE

From: WorldComp2010, Published research paper:

Paper ID #: ICA4325

Title:      Knowledge Normalization Methodology

ICAI'10 – the 12th annual International Conference on Artificial Intelligence (2010).

Introduction to CAMBO, a multi-Expert System Generator, and a new paradigm, knowledge normalization (patent pending). "Any entity's knowledge which can be described in English grammatical sentences can be extracted and managed by software (rule processing) through a Normalized Knowledge Base." Unlike single-EXPERT system generators such as AION, Eclipse, XpertRule and RuleBook, CAMBO's kernel logic, the Knowledge Normalization Methodology, is based upon a methodology that closely observes the rules of medical science in identifying, applying and developing the science of machine intelligence.

Abstract of the Disclosure.

The invention's name is CAMBO, an acronym for Computer Aided Management By Objective. The title is "multi-EXPERT System Generator", and the vision is an artificial intelligent bridge between technology and the ability to automate the instruments of the MBO methodology, namely Charters, Organization Charts, Operational Plans, Project Management, Performance Planning and others, all containing the knowledge, expressed in English grammatical sentences, upon which an enterprise conducts business. It would require the design of a unique combination of advanced methodology and technology capabilities, built upon and working in concert with the current state-of-the-art, 'data normalized' Relational Data Base structure. The "AI Bridge" would include an advanced methodology for normalizing knowledge, a unique definition of a unit or element of knowledge, an advanced structure for a Spatial Relational Knowledge Base and a 5th-generation programming language to support a Natural Language Processing interface.

The USER education continues;____________________________

from: International Cognitive Computing, CAMBO a multi-Expert system generator.

What is it? Each employee of a business enterprise is a human expert possessing a measure of talent, developed through the experience of practical application, interaction and training with other employees. The result of job-related experience is an acquired understanding of how, when and where each employee is expected to contribute towards the operation of an enterprise. This understanding is expressed and communicated through simple conversational language, in which each sentence represents a "knowledge element." CAMBO describes knowledge in segments called elements, with each element structured as a simple sentence in which the employee expresses a single step; added to other related elements, these provide a complete description of a job-related task. CAMBO imposes a single convention upon the formulation of each element, as a control parameter to support CAMBO's fifth-generation programming language, called "Language Instructions Per Sentence" (LIPS). The convention requires that each sentence must begin with an action verb selected from the CAMBO-LIPS "Action Verb List."
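
To make that convention concrete, here is a minimal sketch in Python of the action-verb check; the verb list and the sample sentences are invented for illustration and are not CAMBO's actual Action Verb List.

```python
# Hypothetical sketch of the LIPS convention: a knowledge element is
# valid only if its sentence begins with an approved action verb.
# Verb list and sample rules are invented for illustration.
ACTION_VERBS = {"verify", "record", "notify", "schedule", "approve"}

def is_valid_knowledge_element(sentence: str) -> bool:
    words = sentence.strip().rstrip(".").split()
    return bool(words) and words[0].lower() in ACTION_VERBS

rules = [
    "Verify the customer's credit limit before accepting the order.",
    "The order is accepted if credit is available.",  # no leading action verb
]
for rule in rules:
    print(is_valid_knowledge_element(rule), "-", rule)
# True  - Verify the customer's credit limit ...
# False - The order is accepted ...
```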

“Analytical thinking requires the ability to compose business issues into logical and functional models that correctly reflect business processing, coupled with the skill to communicate the results to all levels of an organization”

Normalizing Knowledge

Knowledge engineering, which includes methodologies, techniques and tools, produces knowledge models for populating a storyboard layout for the design of a multi-expert system. Each knowledge engineering model is a particular life-cycle view of activity, and it models the functionality of a knowledge engine that drives the events within the life cycle. These models identify, capture, profile and relate the language of the enterprise that the multi-expert system supports. Knowledge engineering models the relationship between the business practices of an enterprise and the functionality of an ERP methodology, and this information contributes toward the knowledge normalization process. The methodology for knowledge normalization expresses a knowledge element as an English grammatical sentence. Knowledge engineering codifies business, science and engineering knowledge into its most basic form, the English Grammatical Sentence (EGS).

Each EGS is grouped into rule-sets that become part of a knowledge domain, and because the knowledge normalization process establishes cross-domain relationships, the knowledge of many disciplines unites to answer questions. The procedure for asking questions is a simple, intuitive and interactive web-based menu system that leads the user through a question-and-answer cycle, including a cross-discipline review of the issues leading to a final answer. It responds as though the questions were asked of numerous engineers or business managers in different disciplines, all contributing their knowledge towards identifying and answering issues on a specific business or engineering requirement. However, while the methodology for data normalization remains the standard for developing a relational database, the processes described here are integrated with the processes for knowledge normalization. Data element definitions, profiles, formats, relationships, where-used lists and ontological associations all complement the process of knowledge normalization.

"The methodology for knowledge normalization expresses a knowledge element as an English grammatical sentence. Knowledge engineering codifies the business's Relational Knowledge Base, spanning the four prime domains of knowledge (philosophy, science, engineering and the arts), into its most basic form: language."

Knowledge elements contain the logic by which data elements are created and manipulated. The status or condition of any individual data element is the result of a computer program executing a series of knowledge elements (rules). Knowledge engineering is used to front-end the data methodologies for ORACLE, SYBASE, DB2, FoxPro, SAS, SAP, IEF, MicroStrategy and all RDB application-generating software.

For those readers who remember Herman Hollerith, congratulations on still being part of the technology evolution landscape. We have witnessed and been part of a most incredible industry journey. For this reason, my hope is that your common sense, derived from decades of structured thinking, will view knowledge normalization as the next logical step in the technology evolution. Within this article I have expressed a direction for next-generation technology: a language-based methodology to normalize knowledge. And while it does not replace relational database structure, knowledge normalization (storyboards, rules and NLP) will identify and relate data elements and function as a front-end designer for building an RDB.

Knowledge normalization raises many questions about design, structure and, most importantly, its business application. Consider the capability of reviewing, monitoring and testing new ideas about the manner in which an enterprise conducts business: reviewing the decision-making relationships that reflect the rationale and reasoning logic of each individual employee's job description; monitoring the working relationships between employees, in which the knowledge elements (rules) of each job description are associated with all other employees' job responsibilities; and testing the knowledge storyboard by changing the knowledge elements (English grammatical sentences) and coordinating your changes with ALL employees' decision-making job description rules (English grammatical sentences).

The future of knowledge normalization will reflect a repeat of the post-data-normalization technology evolution. Business intelligence, project management and process control products concerned with the manner in which an enterprise conducts business will need application developers to reprogram them. The migration from database systems to knowledge-based systems will pass from gradually integrating and adapting new technology interfaces to adopting the benefits of real-time decision-making management.

* Special appreciation and kudos to the folks at Wikipedia, the free encyclopedia, which provides an open forum for the development and communication of new ideas.

Dell's new servers come preloaded with hundreds of virtual machines

Dell on Thursday served up a new choice for buying its servers: plug-and-play configurations that include up to 200 VMware virtual machines along with the networking and storage needed to run them.

Dell is also offering a ready-made virtual desktop infrastructure in the same fashion, letting users buy servers pre-configured with hundreds to thousands of virtual desktops in two flavors: VMware or Citrix XenDesktop.

CLOUD NEWS: Rackspace, Dell push OpenStack cloud OS

But wait, there's more. The company tossed in an announcement about a new e-mail backup and archiving service and an expanded partnership with Microsoft, then promised to build 10 new data centers worldwide in the next 24 months (three in the U.S.) to support what it hopes will be a massive move to its cloud. In addition to hosting virtual storage, desktops and e-mail backups, Dell says these data centers will serve up VMware, OpenStack and Azure clouds. All told, the company says it is spending $1 billion on the cloud.

Dell executives explained that despite what virtualization has done for hardware efficiency, it isn't easy to manage. "Customers are more interested in buying VMs than in buying the physical hardware they have to assemble," said Stephen Schuckenbrock, president of Dell Services, during a telephone press conference this week. He added that research reports predict that soon "virtually half the workloads will be running on virtual machines … but it's not that easy to do, to optimize those virtual machines."

Enter the new vStart line of servers. These are built with Xeon-based Dell PowerEdge servers, VMware hypervisors, Dell EqualLogic storage and Dell PowerConnect switches (which Dell OEMs from Juniper), and include essential virtualization management extensions. The infrastructure is pre-assembled by Dell and delivered to the enterprise's site as a single unit, racked and cabled. Customers can own these units, though Dell is also willing to set them up as managed servers, Schuckenbrock promised.

The vStart 100 is a unit preconfigured with 100 VMs and priced at $99,000, while the vStart 200 is configured with 200 VMs and priced at $169,000 (roughly $990 and $845 per VM, respectively). The vStart family will eventually include configurations that offer more and fewer VMs, said Schuckenbrock, but these two standard configurations are available now.

MORE ON DELL: Dell servers preloaded with Ubuntu Enterprise Cloud software

Not wanting to completely tick off the mighty Microsoft, Dell also announced a vague three-year agreement with Redmond that will let virtualized systems be managed by Microsoft System Center. This agreement will one day also allow Dell to offer Hyper-V as a VM choice. When the Hyper-V option materializes, it will be a less expensive choice, hinted Praveen Asthana, vice president of Enterprise Solutions & Strategy. "Microsoft System Center and Hyper-V will give customers more choice and radically change the economics of virtualization," he said. However, the configurations available today star VMware.

Plus, Dell is encouraging enterprises to take the plunge into VDI by similarly offering plug-and-play servers bundled with hundreds or thousands of Citrix XenDesktop or VMware virtual desktops. Customers can choose the number of desktops according to their needs. Dell technically announced its Desktop Virtualization Solutions (DDVS) offering in March, but on Thursday reiterated that the VDI bundle can be had any way the customer wants it: as an owned piece of gear, as a managed service, as a cloud service, or as some kind of custom package in between, with Dell doing all the integration work to ensure customers' apps work with it. It, too, is based on Dell PowerEdge servers, EqualLogic storage and PowerConnect networking.

The Best 3D TVs

For all the hype they’ve received in the past year, 3D TVs aren’t exactly flying off store shelves. The biggest problem is that there’s still very little 3D content available. Also, when they started hitting the market last year, 3D sets were priced much higher than their 2D counterparts. Now, however, prices have come down a bit—probably because these models aren’t selling as quickly as manufacturers had hoped.

If you're in the market for a new HDTV anyway, it doesn't hurt to get a 3D-ready set if you can find a deal. After all, all the 3D TVs listed below deliver excellent 2D performance too. One thing to keep in mind, though: 3D TV doesn't always just mean paying for the set itself. Many manufacturers sell the required glasses separately for as much as $150 a pair. So if you want to be able to enjoy 3D with family and friends, tack a few hundred dollars onto the price of your TV. (The only exception here is Vizio's XVT3D650SV, which uses passive 3D and comes bundled with four pairs of glasses.) And if you're going to watch 3D Blu-ray discs, there's also the cost of a 3D-enabled player. But the good news is that you don't have to buy everything at once; you can get the set first and add the 3D accessories later.
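
To see how quickly the extras add up, here is a back-of-the-envelope sketch in Python; all the prices are hypothetical except the $150-a-pair glasses figure mentioned above.

```python
# Back-of-the-envelope total cost of a family 3D setup.
# Prices are hypothetical except the $150-per-pair glasses figure.
tv_price = 1500.00          # hypothetical 3D-ready set
glasses_per_pair = 150.00   # active-shutter glasses, sold separately
pairs_needed = 4            # enough for a family of four
bluray_player = 200.00      # hypothetical 3D-enabled Blu-ray player

total = tv_price + glasses_per_pair * pairs_needed + bluray_player
print(f"Total cost of the 3D setup: ${total:,.2f}")  # $2,300.00
```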

If you’re ready to make the move to 3D, check out our list of the best 3D TVs below, along with current street prices, or compare these 3D-ready HDTVs side by side. For a top-rated 2D TV, check out The 10 Best HDTVs. And for general HDTV buying advice, read How to Buy an HDTV.