Datamarts are powerful tools for corporate departments, but you need a strategy to keep them out of each other’s way.

They’re like snap-on plastic building toys: Solid. Unbreakable. You can snap a datamart onto the main latticework, and you’ve created a new and profitable corner of your world. Remove one and the rest of the structure is hardly affected. It can be whatever you want it to be. And it’s so easy, a CFO can put one together.

“The overall structure is somewhat complicated, but the end user gets a Fisher-Price toy,” says Jim Curran, senior VP of MIS at State Street Boston Corp., which performs custodial services for the world’s biggest pension funds and institutional investment portfolios.

“The users are generally not technical people, so we had to build something powerful and still avoid the usual pitfalls of datamarts,” Curran says.

The usual pitfalls of datamarts can cost a fortune. Departments pull data down from a warehouse, or in many companies from an operational database system, and slice and dice the numbers to suit their own needs. In today’s decentralized environment, such data quickly becomes entangled in a confusing web of meaning and non-meaning. Is Smith on Main Street the Smith Company or Dr. Laura Smith? Did we sell them a single shipment of white paper for $1,000 last year, or 200 separate reams for a total of $1,000? Or was it a “white paper,” as in “report”?

It gets even more complicated when a business’ assets are as high as State Street Boston’s: $2.7 trillion in custody and $245 billion in investment portfolios. With these numbers, the metaphor of a toy may be scary at first, but datamarts that can be set up easily are just what the banker ordered.

Removing error

To get all these seemingly contradictory qualities in one system, State Street in 1994 purchased the Essbase OLAP tool from Arbor Software of Sunnyvale, Calif. At the time, State Street was running off multiple mainframe databases and data was extracted and manipulated with Eicon Access and other tools. Analysts parsed the data into spreadsheet formats and then into reports. Budgeting was time-consuming and pricing of services lagged behind market information. Financial officers couldn’t go directly to their data–they had to go through a brigade of programmers, analysts, and administrators whose task was to keep the manipulated data of each department from poisoning the raw data of the corporate decision-support systems. Consequently, managers had problems forecasting profitability from month to month. They needed to get closer to their data.

Despite his title, Curran isn’t a technical person by training. He became VP of information systems when he introduced OLAP computing to State Street just two years ago. At the time he was a financial analyst for the company. But he is now keeper of the corporate datamarts for some of the most powerful users in the world.

One of them is John Spencer, controller for the Global Investment Services Group within State Street. “Clients purchase a fairly sophisticated package of services and we have to keep a close watch on the relationship between services offered and profits,” he says.

Spencer’s department uses OLAP templates to drill all the way down to client-level records. Every 30 days, he and his financial staff can see which clients are using what percentage of which services.

“It’s a fast car,” Spencer says. While “a few people can figure out how to use it by doing it,” Spencer says, “we still need more training and we’re in the process of taking the training.” Meanwhile, “we also can hit the database with standard Excel spreadsheets.” Other users tap into the data through Microsoft or Lotus spreadsheets or through Pilot Designer EIS, the front-end product State Street used before it began using the Essbase client.

State Street now has more than 20 Essbase databases totaling about 7GB sitting on a Windows NT twin processor Compaq ProLiant server with 20GB of storage capacity.

State Street keeps the data clean by using only Essbase databases. They are fed operational data from SQL databases and from a general-ledger database built on Computer Associates’ Datacom software sitting on an IBM mainframe. Only a few connections require a human expert to scrub and prepare data. “The back end of this operation is extremely clean,” says Curran. By the end of 1997, Essbase will let users store data in a relational datastore or in the native OLAP Essbase.

One of the early popular approaches to multiple datamart building was to build a huge central data warehouse–as opposed to a distributed data warehouse–and allow departments to create datamarts from that. “But then horror stories started to come out about bad data,” says Kirk Cruikshank, VP of marketing at Arbor Software. “That’s the kind of thing that made Jim Curran go for the distributed datamart model.”

What’s a datamart?

Experts would call State Street’s datamart architecture the perfect setup for keeping departments from bumping into one another as they create their own versions. In a watershed article in Datamation (“Are multiple data warehouses too much of a good thing?” April, p. 94), datamart pioneer W.H. Inmon notes that typical datamarts contain “both summarized and customized data that reflect the individual tastes and needs of the sponsoring departments.” Inmon contrasts that with the “current-level detailed data warehouse,” which is a central storage facility.

“Many sites call a small data warehouse a datamart and this is the beginning of their trouble,” says Dave Gleason, director of authorings and methods at the Information Consulting Group of Oakbrook Terrace, Ill.-based Platinum Technology. In this scenario, the user organizations often build their datamarts before the corporation has built its warehouse, a guarantee of future data clashes and errors. “The idea of building a datamart before you start a warehouse can be good only if the datamart builders commit to some baseline level of data architecture,” Gleason says.

At LTV Steel Corp. in Cleveland, multiple datamart building is architected tightly by using a single product–SAS Institute’s SAS/Warehouse Administrator–end to end. Over the past five years, data warehouse chief Robert Scharl has built 18 datamarts on a 40GB warehouse. LTV says it has realized much more than a 100% return on investment with the SAS warehousing and datamining tools. “We saved more than $16 million in just a few projects,” Scharl says. The one-product architecture has contributed to the bottom line more than anything else.

“Initially I looked at what Oracle and Sybase had to offer and I realized it would cost an extra half million dollars right from the start,” says Scharl. Much of the extra expense would have been the cost of administering an environment that, while more open than the single-product architecture, would have been more complex to manage.

Use any tool

Platinum’s Gleason says that in order to keep data clean across multiple datamarts, “what is critical is that you have a set of enterprise reference data. It follows the dimensional modeling approach: the set of codes and decodes I am going to use and the dimensional decompositions–this is called dimensional hierarchy.”

In Gleason’s theoretical environment, if you build a West Coast sales database and a Northeast sales database you can use results from both to tally sales for the whole group by applying the same definitions to terms for products and services, even if the datamarts are built on different tools and databases and have different levels of granularity.

At Ryder, the Jacksonville, Fla.-based truck rental company, data confusion has never been a problem because the underlying data warehouse is distributed. “The datamart strategy came out of our group,” says Jim Sutter, data warehouse administrator for Ryder. “We insisted on naming considerations and data typing that were uniform.”

Ryder also chose mission-critical business projects to lay the groundwork for the datamart strategy. “Initially we built datamarts to help follow the repair records on our trucks,” says Sutter. “It’s a lot cheaper to pull in a vehicle for routine maintenance than it would be to provide emergency roadside assistance.” Later, datamarts told Ryder where trucks were located so a dealer who was running low could quickly find the nearest rental site where trucks were sitting idle.

Ryder rents data

The heart of the Ryder datamart system is a decision-support system distributed across more than 80 AS/400s throughout North America. The initial data warehouse on which this was built was subject-oriented, fully integrated, and read-only to the end users. That was three years ago.

“We learned over time to give the end users more freedom to use the data,” Sutter says. But having an enterprise data model gives Ryder’s central IT managers control over corporatewide data.

Ryder’s main database is a DB2 common server running on UNIX. The individual datamarts are built with Visual Warehouse, IBM’s datamart software, and run with the VW Administration set.

Since Visual Warehouse has FTP capabilities, Ryder is now building metadata for the Web so that departments can create instantaneous datamarts on the corporate intranet. One result of this, Sutter hopes, is that departments will be less likely to drop below his radar and set up their own renegade architectures.

“Every once in a while someone builds something to answer a question practically on the fly,” Sutter says. “Typically, the ones built outside the rules die on their own real quickly. But we don’t worry about that.” The few that survive will show up sooner or later, but Ryder can make a copy of a renegade datamart, clean up the copy to conform to the enterprise model, and run the copy parallel to the renegade datamart.

Ryder’s ability to build parallel datamarts to outmaneuver any renegades is largely due to the rapid and relatively inexpensive development permitted by the IBM tool. “For $50,000 you can have a major-league datamart up and running in a month,” says George Zagelow, IBM’s business intelligence manager.

Ryder uses IBM’s DB2. Other Visual Warehouse customers deploy Oracle7, CA-Ingres, and Sybase, as well as IMS and assorted flat files.

At State Street Boston, Curran has built 20 datamarts in two years. “I started with Essbase as sort of a project,” he says. “I wrote a lot of the earliest databases with no outside help. If the tools are easy to use, you can build lots of datamarts and manage them.”

And Arbor says it is going to get better soon. It refers to relational/OLAP datamart building as the third wave of data management. “It will allow those who already have made a big investment in relational data to have a foot in each camp–OLAP and relational,” says Arbor’s Cruikshank.

To the end user, this will once again appear to be as simple as playing with snap-on building blocks.

Innovative datamart strategies

Vendors and corporate customers alike have produced a range of solutions for multiple datamarts in recent months.

IBM and Sybase have teamed up to offer a Data Mall that ships on an RS/6000 system. A Data Mall is a data warehouse and datamarts that come ready, out of the box, for your data. The package provides network analysis tools in an effort to overcome performance problems inherent in multiple datamarts. Sybase Enterprise Connect middleware comes with rules for building a datamart out of the central data warehouse, thus automating the techniques perfected by organizations like Ryder.

Oracle’s Data Mart Suites, which just started to ship at press time, include a variety of tools from partner organizations, all aimed at organizing a multiple data-warehouse strategy. For instance, the Oracle package will ship with tools from Menlo Park, Calif.-based Sagent Technology that allow administrators to move data by linking a series of icons. The linked icons create data-flow plans that show which data is flowing from which source. Upon approval of the data-flow plan, the administrator can point and click at a starting icon and begin the data-moving process. This is designed to allow for rapid prototyping and deploying of databases.

Sagent’s data-flow facility is also shipping with Prism Solutions, long a presence in the high-end datamart business. Officials at Sunnyvale, Calif.-based Prism say they will use the Sagent software to provide workgroup datamarts that can be built in hours. As with SAS, Prism takes the single-product road toward datamart integration. Prism’s Scalable Data Warehousing Solutions package the software in five different ensembles, ranging from terabyte data warehousing to multimegabyte datamarts.

Informix, meanwhile, is shipping its NewEra database environment with a variety of tools for multiple datamarts, including Woodcliff, N.J.-based Computer Systems Advisors’ new Silverrun software for forward and reverse engineering of departmental warehouses and large-scale datamarts from Informix data. Users of the Silverrun workbench can model a workgroup repository from Informix data, allowing the data warehousing group to rapidly build departmental datamarts from the same central database, thus controlling business rules and data definitions.

Innovating in a manner similar to that of the IBM Visual Warehouse group, Computer Associates in April announced Workgroup OpenIngres, a DBMS that ships on NT on Intel platforms, NT on Alpha, SCO OpenServer UnixWare, and Sun Solaris for Intel. The idea is to provide a scalable database upon which a warehouse could be built at the enterprise level or a datamart at the workgroup level. Data from that workgroup datamart eventually could be blended with the enterprise system to maintain data definitions and rules. Workgroup OpenIngres ships with OpenIngres Internet Commerce Enabled (ICE) to permit distribution of datamarts across intranets and extranets.

Informatica of Menlo Park, Calif., which began shipping its PowerMart distributed datamart environment in 1996, builds the data warehouse from the variety of datamarts. Using the Informatica Metadata Exchange (MX) architecture, the IT department can integrate data from a variety of OLAP tools, including Andyne, Brio, Business Objects, Cognos, Information Advantage, Infospace, IQ Software, and Microstrategy. The Informatica approach assumes that departments will build their data warehouses independently, and thus bases the future distributed warehouse on the company’s various datamarts.

Vmark Software’s Datastage also builds its environment from disparate datamarts.

NCR, a leading outsourcer of data warehousing projects, comes into the picture mostly after the warehouse is built, insisting that IT rationalize the entire enterprise-database environment. The data warehousing model of NCR layers data types: operational source data on the upper layer, followed by data transformation, followed by a data warehouse built from the “corporate memory” (operational databases), followed by a single RDBMS that contains a single version of the truth. Against this, all disparate data warehouses are examined and catalogued so that the technology, no matter how innovative, never gets in the way of business.

According to NCR senior consultant Rob Armstrong, “An important aspect to understand about the architecture is that the information access and datamining tools need to access both datamarts and the enterprisewide data. If the tools do not have uninhibited access to the correct levels of data, then the warehouse itself is severely limited in its capacity to provide value to the end user.”

Laptop Data Recovery Solutions

Are you troubled by a crashed laptop hard drive that has made your treasured files inaccessible? Fortunately, there are solutions to this problem. Laptop data recovery can be achieved through the various methods described in this article.

One common method of retrieving that precious data is to remove the hard drive from your laptop and insert it into an external enclosure, a device that can be purchased from stores and connects the drive to a working computer via a USB cable. If you can successfully browse through your hard drive, transfer your files either to another external hard drive or to the computer you are working on. Be prepared to wait many hours if you have large files. When the transfer is done, safely eject the drive. The good news in this scenario is that your hard drive is physically fine, though you will still need to reinstall your operating system.

If you are a Mac laptop user, get a FireWire cable and borrow a working Mac. Connect the two machines, then start your own Mac while pressing and holding the T key until the FireWire icon appears on its screen. This boots it into Target Disk Mode, and its hard drive will mount on the working Mac like an external disk. If successful, you can now retrieve the files you need.

Another option is recovery software, which you can purchase from local computer stores or download from reputable sites. Some laptop data recovery software is free to download, and some requires a fee. But when the situation calls for drastic measures, hiring the services of a certified laptop engineer is the best course.

Laptop data loss occurs for different reasons. Sometimes the cause is a software malfunction, such as problems with the operating system or damaged software. Another cause could be defective hard disk drive heads or hard drive motor failure. Data may also be lost through unintentional deletion. The list of possible reasons for data loss could go on, but whatever the reason, the result is the same: data loss leads to frustration and disappointment, and worse still, it can lead to serious financial problems or job loss.

When your laptop fails and you need to extract the data, there are several ways to go about laptop data recovery. You can do the retrieval yourself using data recovery software, which is offered at affordable prices online. Data recovery software works effectively at retrieving files and fixing partitions. Many different packages are available, each with special features, so decide based on what you need, not what you want. Another option for laptop data recovery is a repair service. Laptop repair services are more effective at drawing files out of your laptop because they employ professionals who are experienced and skilled in data retrieval. Of course, their efficient service comes at a price.

People of the 21st century depend highly on their high-tech gadgets for work and relaxation. The computer and the internet play vital roles, as both are means of developing business and work efficiency as well as finding better audio and visual enjoyment. Though today's computers are the most sophisticated ever developed, you must also remember that computers are still machines, and sometimes they fail. Most likely, they will fail at the moment you least expect or want them to, and per Murphy’s Law, things will go downhill from there.

Data recovery companies can help. Computers are not created foolproof, and as such, users are advised to always back up the files and programs on their own computer systems. When you have failed to back up and your computer crashes because of a virus or a physical hard drive malfunction, it is time to seek professional help. Because your data is important and necessary, you should set out to find a service that can most efficiently recover your much-needed files and at the same time attempt to repair your computer’s hard disk. This is where a data recovery company comes in. Do not confuse this with your local computer repair center; there is a difference between the two. Local centers can assist you with repairs, but they might not have the expertise needed to retrieve data from a damaged hard drive. Data recovery companies, on the other hand, specialize in reclaiming data from drives that have physical damage or computers that stubbornly refuse to boot. So when your computer fails you and you have done everything you can, turn to a high-quality data recovery service for much-needed assistance.

The Benefits Of Choosing A Good Data Recovery Company

If your laptop or PC hard disk has sustained physical damage, whatever the cause, the result can be computer malfunction and data loss. Had your data been backed up, a drive failure would be of no consequence. But if you failed to back up your data, then you may be facing a very difficult challenge, especially if the lost data is pertinent to your business, work or project.

In times of computer malfunction, you can turn to the services offered by a high-quality data recovery company such as Hard Drive Recovery Associates. Such a company is built and designed for the specific purpose of retrieving data files from physically damaged laptops, PC drives and RAID arrays, and it has the right equipment and tools along with the necessary expertise. One benefit of using a data recovery service is that it saves time. Imagine the trouble you would be in if you brought your computer to a repair shop and were told to pick it up after three or four days. With a professional data recovery service, your files will be checked and retrieved right away.

The second benefit this kind of service provides is that it is flexible. You can have your laptop repaired online or offline. This makes data recovery companies a big help to everyone.

The third benefit is that it is cost-effective. Companies like these provide various solutions or programs for you to choose from, and you can choose the right solution for you depending on your budget.

Laptops, notebooks and netbooks are the most used computer products these days because they are portable, yet still provide great storage. Laptops can be used anywhere, whether you are commuting on the train or giving PowerPoint presentations in the office or with clients. There are many wonderful advantages to using laptops, but there are also some very distinct disadvantages. For one, because laptops are carried everywhere, they are prone to various movements like bumps, shakes and falls. And because laptop components are miniature versions of those in a personal computer, there is very limited space inside, which makes laptops more fragile and sensitive than desktop computers. It also makes laptops more prone to hard drive problems.

When your laptop fails, do not attempt to open it and fix the problem on your own. Even if you accidentally spilled water or juice on your laptop, it is not a good idea to try to clean the internal components yourself. The best thing to do is take your laptop to a hard drive repair service and have a professional work on your hard drive. That way, you can save both the computer and the data it contains. Hard drive repair services are very efficient at retrieving data and fixing hard drives, and they also offer a good price for their service.

It is very common to find yourself with a damaged hard drive that needs repair. Most people end up so confused after the experience that they are desperate to get a hard drive repaired. Some people have even damaged their hard drives further by trying to correct what they thought was a simple malfunction of their computers. Repairing a hard drive is a task that requires expertise.

It is essential that you consult an expert, because a simple mistake can cause your hard drive to lose all the data on it. That data could be essential for your business or some other purpose. Some people have ended up regretting their decision to repair a hard drive themselves after they lost everything they had on it. One of the most frequent mistakes people make is following video instructions for hard drive repair from the internet; it does not work at all. Another mistake is calling on the assistance of computer repair companies rather than hard drive recovery companies. There is a great difference between repairing computers and repairing hard drives. It is also imperative that you use certified software to repair a hard drive.

The Mac computer may undeniably be one of the most reliable computers there is, but like any machine, it also breaks down. When glitches occur in your Mac computer, there is no need to panic. Even if the cause of the problem is your hard drive, Mac hard drive recovery is very possible.

Before you troubleshoot your Mac computer, you must first determine what kind of hard drive failure you have. If you can hear a whirring noise when you turn on your computer and nothing appears on the screen, you have a mechanical failure. This kind of failure is the most feared kind of computer crash, because it means you can only retrieve the data by bringing your computer to data recovery experts. You can try to retrieve the data on your own, but there will be little chance of success, since mechanical failure cannot be worked around with software, even advanced recovery programs. On the other hand, if your computer does not emit any whirring sound, it is very likely experiencing logical failure. This kind of hard drive failure happens when the file system is corrupted, so the drive cannot find the data. With logical failure, the data may still be on the drive, and you can extract it by using any of the advanced recovery software available for the Mac.

RAID Recovery and Its Functions

Over the years, RAID (redundant array of inexpensive disks) has come into use in many offices. RAID stores data across several disks, commonly known as member disks, that act as a single storage unit. When the disks are in a proper RAID configuration, the computer detects them as one large disk, and the array performs better than a single hard drive: because data is spread across the disks, read operations can take place on multiple disks simultaneously. But multiple disks also mean more complicated failures, and when an array does fail, RAID recovery is the solution that minimizes the damage.

Recovering data becomes a major problem at this point. To get help with it, contact an expert in RAID recovery. Look for the best service providers in town and describe your case; they will be glad to help you solve the problem. Make sure you talk to a trusted representative and follow their guidelines carefully.

In RAID recovery, each hard disk is first duplicated so that the original data and its contents are preserved, and all work is done on the copies. Keen observation and careful technique are required. When the controller card is destroyed, the controller's algorithm must be reconstructed before the array can be reassembled.

Several RAID data recovery service providers offer various ways of recovering data from RAID systems. For many years, RAID recovery has been one of the most difficult tasks for technicians, and it can take days or even weeks for a system to function well again. If you have lost data on a RAID system and try to recover it yourself, the chances of success are low. However, the many RAID data recovery services that now exist have greatly helped people recover their data. Most providers apply specialized knowledge and skills to your case: RAID configuration requires extensive training, so technicians specializing in RAID recovery have gained adequate experience handling different cases, and they use the latest equipment to retrieve your data. Every RAID level requires distinct recovery skills.

After giving a free evaluation of your system, the provider supplies a complete report of the files that can be recovered, and you can decide whether to continue with their services. All RAID recovery jobs are performed inside their facility, and if the data cannot be recovered, no fee is collected. Generally, it takes 2 to 5 days to recover data, though if you need the data immediately, some cases can be completed within the day. All recovered data is kept in strict confidence.

External hard drives are used to store data so that it is accessible on a portable or mobile level. Much like a regular internal hard drive, an external drive's storage is permanent if the drive is maintained properly. An external hard drive is generally judged on three major specs: speed, amount of storage space, and power. External hard drives vary in power consumption depending on their size; for instance, a one-terabyte external hard drive will draw more power than a 320-gigabyte drive. Speed is another aspect that manufacturers take into consideration: for larger external hard drives, manufacturers tend to increase the overall rotational speed to allow faster use of the device and quicker access to data on the platters.

Clearly, the key problem with an external hard drive is the same as with an internal hard drive: it tends to fail after a certain amount of use. This is because the platters are spinning all of the time; unless the drive is turned off, they will continue to spin. Naturally, because this is a mechanical device, it is very likely that the drive will seize at some point. But keeping it well maintained, by ensuring that the disk is error-free and always defragmented, is the best way to ensure that your drive lasts a long time.

The external hard drive is best used sparingly, because keeping it powered on all of the time subjects it to intense wear and tear. This kind of damage usually leads people to consider a professional external hard drive recovery service, especially if the data contained on the external drive is very important to them. The portability of the drive suggests that you should probably not use it for very critical data, because a portable hard drive has a higher chance of being damaged just through regular use or simply by being moved from location to location. A lot of people like the idea of the portability of an external hard drive, but you do have to remember that they are much more likely to suffer from hard disk failure than hard drives that simply sit in a PC box for their lifespan. These are just a few things to consider when you are thinking about using an external hard drive for all of your data storage tasks. It just is not worth the risk!

When you need hard drive repair, immediately search online for the best data recovery service company of 2012. Getting your hard drive fixed is your top priority if you want to recover your data. When you find the best company offering hard drive repair services, contact them and arrange to get your hard drive fixed quickly. You do not want to wait a long time wondering whether your data can still be recovered.

The best way to repair your hard drive is to send it to your preferred service provider. Make sure that your drive does not go near static electricity. Unplug your computer properly. When removing your drive, wear a grounding strap, and ground yourself by touching the metal at the back of your computer. Remove the drive by unscrewing it and put it in a static-proof bag immediately. Wrap the drive in about 6 inches of bubble wrap and tape both ends tightly so that movement is prevented. After wrapping, gently shake the package; if you cannot hear any movement, it is securely taped. Pack the drive in a strong box, fill the box with material that provides protection against rough handling, and tape the box shut. Choose a shipping company that offers tracking, and make sure the drive arrives at the hard drive repair company safely.

The History Of Hard Drive Repair
The technology for hard drive repair has come a long way over the past 50 years. In 1956, when IBM produced its first hard drive, the unit was the size of a modern double-door refrigerator, and repairing it was not even an option. Even 20 years ago, mentioning the words "hard drive repair" would cause headaches and despair, because the information was probably lost. Back in those days, the hard drive simply had to be replaced.

That makes us grateful that today, in the 21st century, the IT industry has advanced to the point where you can go online, type in "hard drive repair" and get at least ten options to choose from. Depending on the severity of the situation, you can either pick up the phone and dial the number on a website or follow the DIY tips available and repair the drive yourself. Not everybody is aware that the industry has advanced to the point where hard drive repair is even possible.

People who are unaware of this technology sit with their heads in their hands, dwelling on how much information is lost and on the price they will have to pay to replace the hard drive.

Broken hard drives are recoverable! Computers are used to process and store a large volume of information, and in the process they can break down due to a number of factors, both internal and external. The most common causes of a broken hard drive involve malicious software. While there can be reasons other than viruses and malware, the main question after your hard drive breaks down is recovery. If you were not prepared with a backup plan or a spare drive, you will be in dire need of data recovery services. There are quite a number of data recovery services available, and you only need to choose one.

It is a good idea to first call your local computer technician to check whether what you have is a broken hard drive or a simple malfunction. If they confirm that the drive is dead, let them take it in for data recovery. The recovery process will benefit you if you were using the broken drive to store important information. It will come at a cost, but if the data on the drive is valuable, the process is worth every penny. Choose the service provider with the best deal.

Computers are simply a must-have for almost anyone who has to accomplish anything these days. For students, acquiring at least one laptop computer is usually a top priority before enrolling in school, because a notebook computer makes life so much easier in so many ways. The ability to take notes at high speed is obviously critical, as is the ability to do research on the internet from practically anywhere. These capabilities are just as critical for business users, who can now take the entire enterprise with them wherever they go, all stored on a simple laptop hard disk.

One major problem computer users experience is a hard drive crash. This can typically be traced to the fact that hard drives run nearly all of the time and suffer serious wear and tear as the years go by. A hard drive crash is a condition in which the drive is broken because of misaligned or damaged parts. The hard drive plays a critical role in any computer: once it fails, you lose access to your files. Preventing a crash is therefore a must, and the best defense is simply to back up your hard disk as often as possible.
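
The advice above, backing up as often as possible, can be automated with a small script. Below is a minimal sketch in Python (an illustration only; the function name and the idea of skipping unchanged files are my own, and you would point it at your real folders):

```python
import os
import shutil

def backup_changed_files(source_dir, backup_dir):
    """Copy files from source_dir to backup_dir, skipping files whose
    size and modification time are unchanged since the last backup."""
    copied = []
    for root, _dirs, files in os.walk(source_dir):
        for name in files:
            src = os.path.join(root, name)
            rel = os.path.relpath(src, source_dir)
            dst = os.path.join(backup_dir, rel)
            os.makedirs(os.path.dirname(dst), exist_ok=True)
            if os.path.exists(dst):
                s, d = os.stat(src), os.stat(dst)
                if s.st_size == d.st_size and s.st_mtime <= d.st_mtime:
                    continue  # unchanged since the last backup run
            shutil.copy2(src, dst)  # copy2 preserves timestamps
            copied.append(rel)
    return copied
```

Run on a schedule, a script like this keeps the backup current without recopying everything each time.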


Simple Laptop Hard Drive Recovery

These guys can help recover hard drive data from laptops.

The worst thing that can happen to a laptop user, apart from losing the whole laptop, is hard drive failure. Consider the possibility of losing most of the valuable files or project data you have been working on; it is a sad and scary thought. Laptop hard drive recovery is often assumed to be very expensive, but that is not always the case: recent software developments have brought the cost of laptop file recovery down immensely. If the drive has crashed or broken, you can usually remove it from the laptop easily: first disconnect the power source, then remove the battery. You will see a rectangular panel on the underside; open it slowly and cautiously remove the hard drive.

With data recovery software on another computer, you can start the process with a wizard and recover the data files by storing them on the working computer's hard drive before restoring them. Make sure any new hard drive has been defragmented so that the restored data is in the best condition possible. Recovery tools can also help bring back the drive's functionality and return its processing speed to the rated speed.

However, this procedure is delicate and should be carried out with caution. Slight damage to the hard drive's connection pins can ruin the drive permanently. Still, the procedure can be fairly simple when the instructions are followed correctly.

Suppose you woke up one day and found your computer completely malfunctioning, with no way to access any of your data. You quickly discover that your hard drive is damaged and that you may need some kind of hard drive recovery to reach your important files. Think it couldn't happen to you? Think again. Hard drives are incredibly delicate as electronics go, and sometimes nothing more than a minor vibration can all but destroy your data. What can you do if you have a broken hard disk or a severe hard drive problem? I'm glad you asked.

Many basic maintenance tasks can lessen the chance of a hard drive problem appearing. Never keep using your computer when you know you have some kind of hard drive failure; power it down as soon as possible. Do not place the machine near any source of high temperature, because it might overheat. Keeping your computer cool is always good practice, but it is essential when you have a failing drive. Remember that any picture, music file or text document can easily be wiped if you continue to use a failing hard drive. Get the drive to a data recovery professional as soon as possible.

Taking the Best Route When You Experience A Hard Drive Issue

It is always difficult to anticipate a hard drive problem, and discovering that your hard drive is breaking down can feel catastrophic to pretty much anyone. The advice almost every hard drive recovery professional gives is to stay calm and not panic, if only to ensure that you do not do anything rash and potentially damage the drive beyond recovery.

Disk Error Or Crash?

Sometimes hard drives simply do not get detected. This is the nature of the beast, and it does not mean that your drive is physically destroyed or that your data is inaccessible. When the computer doesn't discover the disk drive you are looking for, or doesn't recognize the drive at all, you may not actually have a hard drive problem. In cases like this the culprit may be the motherboard's SATA ports, which rarely break but can, or a fault on the hard drive's own PCB board. Either way, it is better to take the computer to a professional to check what is wrong. Remember also that if the drive has actually failed, it is better to take it to a professional data recovery service than to rely on a computer repair shop. It will be cheaper in the end, and you can be assured the data will be recovered in a proper clean room environment, which is critical if you want an effective hard drive recovery.
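
Before visiting a professional, you can make a first-pass check of the "not detected" case yourself. Here is a tiny Python sketch (the function name is mine, and the `/dev/sda` path is only an example from Linux): it reports whether the operating system has created a device node at all, which separates "not detected" from "detected but failing":

```python
import os

def drive_detected(device_path):
    """Return True if the OS has created a node for the drive.
    A missing node suggests a detection problem (cabling, a SATA
    port, or the PCB), not necessarily a dead platter."""
    return os.path.exists(device_path)

# Example on Linux: drive_detected("/dev/sda"); the path is illustrative.
```

If the node is missing, suspect the connection before condemning the drive itself.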

Laptop drive repair

Hard drive repair is a process used by professional data recovery services to restore and retrieve deleted or lost data and files from a damaged device. Most government offices have data specialists who can retrieve almost any type of file from different kinds of devices. Data recovery services are helpful because they can retrieve seemingly lost data and files from both lightly and severely damaged devices. These services are specialized by nature, which is why, if you have a disk failure on a portable computer, you will want to find somebody who specializes in laptop data recovery. If you are having a hard time finding a data recovery company near you, browse the internet and look for a reputable one close by. Several companies on the internet offer high-quality data retrieval services, and finding a reliable, efficient one is not as challenging as it might seem. Finally, remember that a hard drive repair expert can often retrieve lost or deleted files remotely over an internet connection, so it is fine if there is no data recovery expert nearby.

Do you want a fast and reliable hard drive recovery service? Many data recovery companies can retrieve and restore your data and files in no time. Recovering deleted, corrupted or missing data is easy with the help of a well-qualified drive repair specialist. These people typically specialize in retrieving particular kinds of files and data from a device. If your device is damaged through human error, natural disaster, technical problems or the like, they can successfully recover all the data and files from it. If you are not familiar with how data recovery works, you might have a hard time retrieving your files yourself. The process of hard drive recovery usually takes days or weeks, depending on the situation, so be patient if you encounter data loss; your files can often still be retrieved. No matter how grim the scenario might seem, there is always a possibility that you can get your files back.

Backing Up with a USB Drive

Your hard drive will always be one of the most critical parts of your computer, so you should also keep a USB key for backup, to make sure your files survive even while a hard disk recovery service works on the drive itself. Why? Because you should always have a backup of the files you use and update all the time; there is no need to back up files you never touch. A USB drive makes a sensible backup device because it can save you a lot of money. A large external hard drive only makes sense if you already own one; if you do not, a simple USB drive will do great things to ensure your data stays as safe as possible.
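
The point about backing up only the files you actually use can be made concrete by filtering on modification time. A small Python sketch (the function name and the 30-day cutoff are arbitrary choices of mine for illustration):

```python
import os
import time

def recently_modified(directory, max_age_days=30):
    """Return paths (relative to directory) of files modified within
    the last max_age_days; these are the ones worth copying to a USB key."""
    cutoff = time.time() - max_age_days * 86400
    recent = []
    for root, _dirs, files in os.walk(directory):
        for name in files:
            path = os.path.join(root, name)
            if os.stat(path).st_mtime >= cutoff:
                recent.append(os.path.relpath(path, directory))
    return recent
```

Feeding this list to a copy routine keeps the backup small enough to fit on a modest USB drive.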

Getting To Understand VMware Recovery

I recognize that this industry has to keep moving if it wants to continue producing enough equipment to keep the web running smoothly. One of the best technologies invented to deal with this over the past couple of years has to be the VMware server, although VMware recovery is not as easy as it looks. These guys got rich in the public markets for a reason, and I get that, because VMware remains one of the best server platforms out there.

But VMware Is Not without Its Problems

What I am hearing from the trenches about the consistency of the server product is that a lot more people are having to get professional VMware recovery than with regular server platforms such as Sun Solaris or Linux. I don't know whether it has anything to do with the stability of the platform, but I do know there is a program installed inside the operating system that enables you to recover data. Obviously, they decided to keep it simple and just call it VMware data recovery.

After looking over the platform and some of the specifications on the web, I realized it is probably not the best application for somebody who is just starting out. It is a fairly complex cloud computing server, and I think it is a little pricey for what it is. I'm not saying I'm a huge server enthusiast, but when it comes down to simplicity, nothing beats Linux. I have run Linux on IBM servers for a long, long time and have always been impressed with their power and speed.

So, after discussing some of the issues with a couple of technicians, I realized there need to be much better VMware disaster recovery scenarios in place. VMware needs to recognize that sometimes VMware servers are going to fail, and often it has nothing to do with the operating system itself. Hard drives are simply not indestructible, even in a VMware setup.

At any rate, I can certainly say that a lot of people have come to me with questions about RAID recovery in the past, and I have typically not had a lot to say. You have to understand that RAID setups are typically pretty complex, and unless you are running something like RAID 1, you will probably have difficulty when you experience a critical hard drive crash.

Anyway, back to the VMware server. At this point, I'm not entirely sure it is the best solution for anyone. It is still fairly untested technology, and for now you have to recognize that unless you have some kind of VMware disaster recovery plan in place, or a very consistent backup schedule, you might end up in a bit of trouble.


Programming Made Better On The Mac

Genuine innovation is rare. More often than not, new ideas are really just refined old ones. But every so often, someone breaks completely new ground, and TGS Systems does just that with Prograph. It's truly a new way to program, and the Mac is its first home.

VISION Until now, programming has always been associated with writing endless lines of impenetrable (for normal people) code. Cranking it all out is a chore for professionals, and the mysterious-looking result is daunting indeed for nonprogrammers.

Prograph is the first completely visual programming language. Unlike CASE tools, which use flowcharts to generate conventional text-based code, every element in Prograph has meaning. There's no behind-the-scenes code generation or execution.

Each programming construct in Prograph has a unique pictogram associated with it. You can construct programs by connecting these icons, and the connecting links represent the way data flows in the program. As Prograph's data-flow-oriented design doesn't force a program to execute in a particular sequence, Prograph is eminently suitable for parallel processing.
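
Prograph itself is visual, so text can only approximate it, but the dataflow idea is easy to sketch. In the Python sketch below (all names are mine, invented for illustration), each operation fires as soon as its inputs are available, and no overall statement order is imposed:

```python
def run_dataflow(nodes, inputs):
    """nodes maps name -> (function, [input names]). An operation fires
    as soon as all of its named inputs have values; the order in which
    operations are defined is irrelevant, as in a dataflow language."""
    values = dict(inputs)
    pending = dict(nodes)
    while pending:
        ready = [n for n, (_f, deps) in pending.items()
                 if all(d in values for d in deps)]
        if not ready:
            raise ValueError("cycle or missing input")
        for n in ready:
            func, deps = pending.pop(n)
            values[n] = func(*[values[d] for d in deps])
    return values

# Two independent branches; neither forces the other to run first.
graph = {
    "sum":  (lambda a, b: a + b, ["x", "y"]),
    "prod": (lambda a, b: a * b, ["x", "y"]),
    "diff": (lambda s, p: s - p, ["sum", "prod"]),
}
```

Because "sum" and "prod" have no dependency on one another, a parallel implementation could run them simultaneously, which is exactly the property the article credits to Prograph.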

Programmers may find Prograph a bit disconcerting at first, but nonprogrammers should accept it readily, as they have no notions shaped by experience with other programming languages.

OOP 101 Prograph is an object-oriented language. Object-oriented programming (OOP) is a method of software design that, among many other things, matches an application’s operations to specific data types; this is called encapsulation. For example, subtraction is an operation that can be performed only on two numbers.
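
Prograph's pictograms cannot be reproduced in text, but the encapsulation idea translates directly into any OOP language. A Python sketch (the class and its methods are invented here purely for illustration): the data and the only operations valid on it travel together.

```python
class Account:
    """Encapsulation: the balance and the operations allowed on it
    live in one class, so callers cannot apply arbitrary operations
    to the data."""

    def __init__(self, balance=0):
        self._balance = balance  # leading underscore: internal by convention

    def deposit(self, amount):
        if amount <= 0:
            raise ValueError("deposit must be positive")
        self._balance += amount

    def withdraw(self, amount):
        if amount > self._balance:
            raise ValueError("insufficient funds")
        self._balance -= amount

    def balance(self):
        return self._balance
```

Just as subtraction only applies to numbers, `withdraw` only applies to an `Account`, and only in ways the class permits.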

In addition to forcing programmers into a change of mind-set, OOP forces them to learn new terminology. For example, an object is an incarnation of a class, in much the same way a variable in a program is an incarnation of a variable type.

OOP also encourages you to reuse existing code. Once you've developed a class (the description of data and the operations that may be performed on it), the code is self-contained and can often be reused in another program without change. Without encapsulation, code reuse is much more difficult.
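
The reuse claim can be made concrete. In this Python sketch (the names are invented), the same self-contained `Point` class serves a plotting routine and a geometry routine without any modification:

```python
import math

class Point:
    """Self-contained class: data (x, y) plus the operations on it."""
    def __init__(self, x, y):
        self.x, self.y = x, y

    def distance_to(self, other):
        return math.hypot(self.x - other.x, self.y - other.y)

# "Program 1": total path length for a plotting tool.
def path_length(points):
    return sum(a.distance_to(b) for a, b in zip(points, points[1:]))

# "Program 2": nearest neighbour for a geometry tool. Same class, unchanged.
def nearest(origin, candidates):
    return min(candidates, key=origin.distance_to)
```

Neither program needed to know how `Point` stores its data; that is what makes reuse without change possible.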

Prograph ships with a set of system classes that make the construction of applications simple. By building on the framework of the supplied system classes, users can design an application's user interface in full WYSIWYG. Menus and other standard Mac-application user-interface elements can be built on-screen and linked to the appropriate Prograph constructs with a few clicks and drags. You can also step through your application and trace the data flow to find out where any mistakes lie.

INTERPRETING 101 One of the penalties of Prograph's interactivity is that the language is interpreted. As with HyperCard, you need Prograph to run any Prograph application. And once an application is finished, it is by no means blindingly fast. Fortunately, Prograph was designed from the start as a compilable language.

By the time you read this, Prograph 2.0 and its compiler should both be available, making possible true stand-alone applications. The compiler will generate native 68000 code and won’t just automatically incorporate a run-time interpreter, so you’ll get real code.

THE BOTTOM LINE Prograph is novel in many respects. Traditional programmers may balk at its originality, but novices should take to it more readily. Although it's a high-level language, it still provides direct access to the Mac Toolbox, and programmers can get as down and dirty as necessary.

Prograph is a clear sign of things to come. If your aim is to learn about programming, you can’t afford to overlook it.

Bilingual Programmers Wanted!

Bilingual programmers wanted from schools

The classified ad read, “Cobol programmer wanted,” yet earlier in the year the professor had distinctly and repeatedly sullied that language’s name, calling it the most useless on the face of the earth.

Instead, the student was told that one should learn a pure language, well-structured, with elegant constructs and an easy syntax … something along the lines of Pascal. Now if only the job market would realize this too, and offer jobs where Pascal is the language of choice.

This situation arises far too often today, especially at the university level where the languages taught often have little bearing upon the industry’s immediate needs.

It can be frustrating for a recent graduate to have to learn a programming language over again in order to contribute in a work environment.

Granted, it is not overly difficult to learn the syntax and structure of a new language, especially when one is already familiar with programming concepts and algorithms.

However, it does beg the question: is the teaching out of touch with the real world?

Muddled thinking

There are many reasons why post-secondary institutions, especially universities, may teach little-used languages. First and foremost, it is not as important to teach the constructs of Pascal rather than Fortran as it is to teach the elements of structured thinking.

Initial training in the right language is of little help if muddled thinking reduces the programmer’s efficiency.

It is far more important to teach how recursion works than the exact syntax in Pascal as opposed to C.
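
The recursion point deserves a concrete illustration. The concept, a base case plus a call on a smaller instance of the same problem, is identical in Pascal, C or any other language; only the syntax differs. A sketch in Python, chosen here simply as a neutral notation:

```python
def factorial(n):
    """Recursion in any language has the same two ingredients:
    a base case, and a call on a smaller instance of the problem."""
    if n <= 1:                       # base case stops the recursion
        return 1
    return n * factorial(n - 1)      # smaller instance of the same problem
```

A student who grasps those two ingredients can rewrite this in Pascal or C in minutes; the hard part was never the syntax.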

It is also desirable to simplify learning for beginners through the use of teaching languages, where the syntax lends itself to a painless introduction of basic concepts.

It is not necessary to needlessly complicate the demonstration of loops, for example, by using an arcane language which can confuse learners.

This is somewhat similar to learning a second language, where one learns the words of the new language that convey the same meaning as in the original. Indeed, one of the virtues of Pascal and, more recently, Turing is their suitability as good teaching languages.

Then there is the value of advanced research.

Some languages taught in post-secondary institutions may not have a commercial use today, but serve to lay the foundation for the next generation of languages or programming techniques.

It is ludicrous to expect all teaching to be about current languages because that would effectively cut off research and development.

Massive libraries

That, however, does not solve the needs of businesses. There is little cost-benefit in rewriting massive libraries of custom programs, especially when they are time-tested and work well. Indeed, it is often more beneficial to maintain and expand those programs, and this of course entails hiring programmers versed in the original language.

The end result is that schools should continue teaching advanced techniques, but hand-in-hand with established languages, and programmers should learn whatever language is needed in the marketplace in addition to what they are taught. In other words, they should become bilingual.

Patrick Abtan is in charge of the computer program at Agincourt Collegiate Institute in Agincourt, Ont. A former computer consultant, his current projects include the development of software for the medical and business fields.

SAA No Longer Criticized

It’s common practice to criticize IBM’s Systems Application Architecture (SAA) as incomplete or poorly planned, or both.

Taking exception to this rule, a group of independent observers gathered here at the recent SAA World conference to praise the IBM standard as a true facilitator for multiplatform computing.

The problem lies not with SAA’s scope of functions, the observers said, but rather with IBM’s inability to explain and position SAA within its own product mix.

“IBM has shown a total inability to relate its own announcements to SAA,” said Charles Brett, president of C3B Consulting Ltd., a systems-consulting firm in Lafayette, Calif.

Following discussions with IBM about SAA, Brett concluded, “There’s a coherence [to SAA] that’s quite invisible to the outside world — a clarity about where things are and where they will be.”

The concept of SAA stemmed from needs within IBM’s own systems areas, according to Robert Berland, vice president of software and vendor support for applications solutions at IBM in Milford, Conn.

“Our business systems people have all the problems of our customers,” Berland said. “We used to have 25, 30 variants of COBOL so we couldn’t move people from one project to another. We needed [SAA] as bad as any [company].”

Today, SAA is widely portrayed as a matrix, with tools — such as COBOL compilers — along one dimension, and platforms — such as OS/2 — along another. C3B Consulting's Brett said this is an easy way to illustrate SAA's portability aspects, but it's a poor view of the environment because it focuses attention on the blank spots in that matrix.

“SAA doesn’t mean that every tool appears on every platform,” he said.

The notion of an incomplete matrix feeds the two common criticisms of SAA: that there are several platform/tool combinations too rare or too unwieldy to be worth the effort, and that SAA won’t be complete until all combinations are covered, thus it is too long in coming to be useful.

SAA would be better received by both users and software developers, Brett said, if it were viewed solely in terms of its interfaces — Common User Access for the user interface, Common Communications Support for interaction between machines and Common Services for the other elements that make SAA available.

“As long as [SAA] is seen in terms of the interfaces, it remains manageable,” Brett said.

John Tibbetts, president of Kinexis, a systems consulting firm in San Francisco, had a similar opinion.

“SAA is less about portability of programs than about portability of functions, users and programmers,” Tibbetts said. “The issue should be defined as the conformance of any solution with the guidelines, not the completeness of some master grid.”

Tibbetts used an analogy between an old-fashioned street of independent stores and a modern shopping mall to illustrate the differences between existing applications-development practices and SAA's focus on heterogeneous computing.

“It used to be you had to buy land, hire an architect and do a lot of things that had nothing to do with what you wanted to sell. Today, you can just rent space,” he explained. “[With SAA], IBM is creating a vast software mall.”

Though he thoroughly backs SAA, Tibbetts warned that there are still some unresolved issues.

“[SAA] is a huge experiment,” he said. “It’s never been shown that a repository can provide the communications horsepower to make disparate applications work together seamlessly. I think it can, but it’s not proven.”



Paradox Engine Still Revs

Borland International Inc. is expanding the language options for its Paradox Engine and bolstering it with Windows support in an upgrade planned for release early next year, company officials confirmed last week.


The Paradox Engine 2.0, which is now in beta test, will feature function libraries that let developers write applications in Turbo C++ and Turbo Pascal, as well as a dynamic link library (DLL) used to create Windows 3.0 applications, said officials of the Scotts Valley, Calif., company.

“The bottom line is that this makes the [Paradox] Engine a far more flexible tool and opens it up for other developers,” one beta tester said. “A developer with Pascal code can use existing code and doesn’t have to rewrite it.”

With the 2.0 release, Borland is moving closer to its goal of integrating its languages and applications. “The Paradox Engine has become the cornerstone of our interoperability path and a key strategic product for us,” said David Watkins, director of product marketing for Borland’s database business unit.

The Paradox Engine, which currently operates only with Turbo C or Microsoft C, is a set of routines that deliver the core data-handling capabilities of Paradox. It is designed to give developers extra horsepower to write database applications when the Paradox Application Language (PAL) isn’t robust enough to fit the bill.

With the 2.0 upgrade, developers can link their C, C++ or Pascal code to the appropriate function library and tap Paradox’s database and index files, record-locking capabilities and interactive front-end features, according to several beta testers. The Paradox Engine upgrade will come with separate disks for each supported language, they said.
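
One of the capabilities listed above, record locking, can be illustrated generically. This is emphatically not the Paradox Engine API (its actual function names are not given in the article); every name in this Python sketch is invented. The idea is simply that a writer locks only the record it is updating, so two writers serialize per record rather than per table:

```python
import threading

class RecordStore:
    """Toy table with per-record locks, sketching the idea behind an
    engine's record-locking: writers serialize on one record at a time."""

    def __init__(self):
        self._records = {}
        self._locks = {}

    def _lock_for(self, key):
        # dict.setdefault is atomic in CPython, so each key gets one lock
        return self._locks.setdefault(key, threading.Lock())

    def update(self, key, change):
        with self._lock_for(key):            # lock only this record
            current = self._records.get(key, 0)
            self._records[key] = change(current)

    def read(self, key):
        return self._records.get(key)
```

With the lock held across the read-modify-write, concurrent updates to the same record cannot lose each other's changes, which is the guarantee a database engine's record locking provides.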

The 2.0 release will also let developers take advantage of the object-oriented capabilities of Turbo C++ and Turbo Pascal; users, for instance, can create libraries of reusable code to build applications, the beta testers added.

Borland will also give developers the option of creating smaller applications with the Paradox Engine 2.0. The upgrade provides access to the Virtual Real-Time Object-Oriented Memory Manager (VROOMM), Borland’s proprietary memory manager that produces smaller, more concise code, beta testers said. However, there is a trade-off: Those who use VROOMM will not benefit from the faster speed of the Paradox Engine, they said.

“[The Paradox Engine 2.0] is very solid and much faster than the original version,” noted one beta tester.

While the Paradox Engine is not a mainstream product, it does have appeal to Paradox developers who want to write applications or pieces of applications in other languages, observers said.

“There are a lot of situations where the [PAL] language comes up short, and you have to do something on the exterior and get into a lower-level language,” said Sam Birnbaum, senior developer at Voice Data Management International Inc., a management consulting company in Uniondale, N.Y., and a Paradox user.

“In some cases, applications created with the Paradox Engine are faster than those created with PAL,” added Alan Zenreich, president of Zenreich Systems, a consulting, software publishing and training firm in Oradell, N.J. This is partly because lower-level languages are more efficient and partly because developers don’t have to handle database functions exactly the way Paradox would, Zenreich explained.


Turbo C++ – Still Effective?

After so many over-hyped “revolutionary” methods brought to us by the computer industry, it is easy to dismiss the object-oriented paradigm as just one more in a long list of “it works great in the lab for specialized applications, but we can’t seem to apply it to our problems”.

However, I don’t think that this is the case with OOPS (object-oriented programming systems).

The elegance and power of OOPS have immediate ramifications to the way you build and maintain applications.

Turbo C++ by Borland offers programmers one of the most comprehensive development environments around. Putting aside the dilemma of whether C++ is really an OOPS language (I think it is good enough), there can be very few easier ways for programmers to learn and apply the techniques of OOPS than by diving into this package.

The first thing users of previous versions of Turbo C will notice is the new integrated development environment (IDE).

Incorporated into this IDE is a multi-windows editor, mouse support, and on-line help for all functions.

The on-line help feature combined with the clipboard allows you to paste elements like functional prototypes from the help window into your application.

One nice new feature is called the transfer function. This allows you to link external utilities into the Turbo C++ menu tree, which can then be invoked from inside Turbo C++. To allow this, Turbo C++ swaps almost all of itself out of memory to give the utility as much room as possible to run.
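
The transfer-function idea (hand control to an external utility, wait, then resume) has a modern analogue in spawning a subprocess. A Python sketch of the pattern; the function name is mine, and the "utility" invoked is just the Python interpreter itself so the example runs anywhere:

```python
import subprocess
import sys

def run_external_tool(args):
    """Invoke an external utility, wait for it to finish, and return
    its output, much as a transfer function hands control to a tool
    and resumes when it exits."""
    result = subprocess.run(args, capture_output=True, text=True, check=True)
    return result.stdout

# Portable example: use the current Python interpreter as the "utility".
output = run_external_tool([sys.executable, "-c", "print('assembled ok')"])
```

Turbo C++'s version is more drastic, swapping itself out of RAM first, because on DOS the 640K the utility needs is the same 640K the IDE occupies.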

Also included in the IDE, is an upgraded project management facility.

This component allows programmers to define different file translators for different parts of their project. Take, for example, a large program, parts of which are going to be written in assembler and others in C++.

Then, through the project manager, you can specify TASM (Turbo Assembler) as the file translator for the assembly language routines, and TC for the C++ routines of your project. Another nice feature of the project management facility is an annotator, which allows you to store notes along with each project.
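
The per-file translator assignment described above boils down to a mapping from file extension to tool. A Python sketch (the table and function are my own illustration; the command strings merely stand in for the real TASM and TC invocations):

```python
import os

# Illustrative commands standing in for the real translators.
TRANSLATORS = {
    ".asm": "tasm",   # assembly language routines
    ".cpp": "tc",     # C++ routines
    ".c":   "tc",
}

def translator_for(filename, table=TRANSLATORS):
    """Pick the file translator for a project file by its extension,
    as a project manager does when building a mixed-language program."""
    _root, ext = os.path.splitext(filename)
    try:
        return table[ext.lower()]
    except KeyError:
        raise ValueError(f"no translator registered for {ext!r}")
```

A build then just walks the project's file list, asks this table for the right translator, and invokes it per file.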

Memory model

The IDE stores environment preferences and the current state of the desktop along with each project. This feature can save a tremendous amount of time.

Each time you exit the IDE or switch projects, things like the memory model you are currently using, along with the state of the edit windows you have open, are all stored.

When you return to the project of interest you are back into the IDE exactly as you had left it.

Turbo C++ is a full implementation of C++ 2.0, but also offers backward compatibility with Turbo C. If you desire only to program in C, you can still use Turbo C++ to do so. Many C++ implementations come in the form of C preprocessors.

That is, since C++ was designed so that it may be translated into C, which may then be compiled, these preprocessor implementations just do this translation for you. You then still have to compile the resulting code to get your executable. This is not the case with Turbo C++. Turbo C++ generates an executable directly from your source.

Integrated debugger

The big advantage of this is that you can use the integrated debugger to debug your C++ program directly, not its C equivalent.

Although the professional package includes an external debugger (Turbo Debugger) that can handle really tough debugging jobs on large programs through remote debugging techniques and/or external memory, the debugger in the IDE is by no means a slouch.

One of my favorite features of Turbo Debugger — inspector windows — has been incorporated into the IDE debugger. These allow you to traverse complex data structures with ease. The external debugger includes a class browser which can help you organize your class hierarchy.

Programmers can also take advantage of Borland’s VROOMM technology. This is the overlay method being incorporated into all of Borland’s DOS products, enabling large applications to use either available memory above 640K or disk space to swap out unused parts of the program.

Application writers using Turbo C++ can specify modules to be overlaid and then let VROOMM do the rest.

It is often remarked that the success of OOP will depend on the richness of the supporting class libraries. With Borland’s stature in the DOS marketplace, I am sure that specialized class libraries (e.g., for serial communications) will quickly become available to complement the excellent ones already included in the Turbo C++ package.

So what’s missing? Currently there are no hooks into Windows 3.0.

If your requirements involve writing Windows-based applications, you should look at Microsoft’s C 6.0/QuickC compilers. In all other respects, Borland’s Turbo C++ is a superb package for programmers.

dBASE Standard Recognized

Tired of dodging Ashton-Tate’s high-handed attempts to guard the dBASE standard, several members of the dBASE community are joining forces to promote a non-vendor-specific standard.

Championed by longtime dBASE guru Adam Green, the “Xbase” project has a dual purpose: to develop guidelines for a data dictionary that will promote data sharing among dBASE-compatible products, and to evangelize the use of Xbase as a generic term for the dBASE language.

“Never before has there been an independent name or an identity created for the dBASE standard,” said Michael Masterson, president of Masterson Consulting, an information-systems consulting firm in San Jose, Calif. “This is the first public effort that constitutes an identity for the dBASE language apart from Ashton-Tate.”

While Xbase’s data-dictionary effort has attracted widespread support from a number of leading database firms, including Fox Software Inc., Oracle Corp., Alpha Software Inc. and WordTech Systems Inc., Ashton-Tate has declined to participate, according to Green.

The Xbase group hopes to deliver the first draft of the data-dictionary standard in March, Green added.

“We are very interested in an open data-dictionary standard, which would make both users and consultants comfortable in mixing and matching the best [dBASE] products,” said Richard Rabins, co-chairman of Alpha Software, located in Burlington, Mass.

The Xbase group’s efforts come at a time when Ashton-Tate’s muscle-bound legal maneuvers are already on shaky ground.

Earlier this month, a California district court dismissed Ashton-Tate’s charges against Fox Software and The Santa Cruz Operation Inc., and ruled the Torrance, Calif., company’s dBASE copyrights to be invalid.

“Ashton-Tate for years has been hassling people over variations on the use of dBASE. It has long been a problem,” said Pat Adams, president of DB Unlimited, an independent database consultant in Brooklyn, N.Y. “Its aggressive and belligerent attempts to defend its trademark have driven [the community] to the defensive stance of using Xbase.”

The catalyst for the formation of Xbase was a recent move by Ashton-Tate to squash any references to dBASE in literature promoting DBExpo, a database conference planned for March of next year, according to industry observers.

Co-sponsored by the International DBASE Users Group (IDBUG), the conference will now be called an Xbase exposition, and IDBUG will change its name to reflect the term Xbase, said officials of IDBUG, based in New York.

SQL Bug A Hoax

Relational database suppliers are unimpressed by claims from database consultancy Butler Bloor about an alleged bug in the way they have implemented the standard database query language SQL. Earlier this year Butler Bloor unearthed a flaw in the way some suppliers – which the consultancy has refused to name – implemented cursors, subsets of data.

According to Butler Bloor, this could lead users to lose updates because some SQL implementations use copies of data rather than pointers.

Now the Milton Keynes company has revealed details of a related bug, using a fragment of SQL code to show how updates can be lost.

Butler Bloor ran this code on seven different database products and says none provided “consistently correct results”, on platforms ranging from the IBM PC to mid-range boxes and mainframes, including systems running under Unix.

The test centres on a walk-through of a file, trying to update a simple table so that three employees receive a raise of 10 units and all managers receive an extra one for each employee they manage. Butler Bloor argues that using the code, users can see how opening a cursor and processing it can lead to data loss.

“This means people are losing information on reasonably standard type updates,” says Martin Butler, chairman of the consultancy.

But Carlos Migues, UK product manager for Ingres, dismisses the problem as “simply bad programming style”.

“From within a program there are lots of things a programmer can do, and some of these can damage or corrupt data. We can be expected to protect one database from another and shield one programmer from another, but the way this code is written, it uses variants that are just bad style.”

Liz Huggins, UK product manager for Cincom, adds, “Butler Bloor is making a valid point, but I think it’s a bit contrived. The code is in error and wasn’t good SQL. Smaller companies, which may not have large staffs to deal with problems like this and to handle well-known flaws, may be having a favour done.”

Ed Dee, UK database languages rapporteur for SQL, says the example is based on an area in which the SQL standard says results are “undefined”. “Butler Bloor has identified a real problem, but it wasn’t the first to identify it,” he says.

According to Butler Bloor’s managing director Robin Bloor, “The point isn’t about a particular piece of code, but about the fact that products are unable to keep two data images consistent. The code is meant to show that, and you can’t blame programmers.”

Splashing With C

The Symantec Programming Languages Association — SPLAsh for short — is a new programmers’ group targeted at users of Symantec’s languages. Greg Dow, developer of the THINK C class library and founder of SPLAsh, writes in the first issue of the group’s journal, THINKin’ Cap, “SPLAsh’s goals are to provide pertinent and practical information for THINK programmers…. Programming the Mac is hard enough without having to figure everything out by yourself. . . . SPLAsh can provide meaningful and usable source-code examples as well as address issues relating directly to the development environments.”

We agree with Greg. It’s always good to see how someone else solved a problem you’ve been struggling with. The SPLAsh journal and code samples seem to provide another good opportunity to gain wisdom by example.

SPLAsh’s annual membership costs $30 and includes quarterly issues of THINKin’ Cap, a source-code disk, notices of SPLAsh meetings, and access to the organization through electronic information services.

The use of graphic novels to engage students has increased steadily in recent years. School librarians and teachers throughout elementary, middle, and secondary schools are integrating graphic novels into English language arts learning and across the curriculum. The growing acceptance of graphic novels in teaching and learning activities is due in large part to their usefulness and appeal as tools with which to engage reluctant and struggling readers. The combination of text and pictures that is employed in these materials has proved to be of high interest to readers and offers them ways to be successful in their literacy activities. This article discusses how graphic novels may be used with students who struggle with reading comprehension due, in part, to hearing loss.

Easily my favorite: Walker Bean!

In his book Understanding Comics: The Invisible Art, Scott McCloud provides what is the most widely accepted definition of comics: “Juxtaposed pictorial and other images in deliberate sequence, intended to convey information and/or an aesthetic response in the viewer” (1993, 9). What this tells us, more simply stated, is that comics use pictures and text to tell a story. Iconic examples of this medium are Charles M. Schulz’s Peanuts comic strip and the popular comic books featuring Archie Andrews and his friends with which you may already be familiar. Graphic novels employ the same medium as these comic strips and comic books, and, like them, use pictures and text to present information. While comics are shorter, less expensive, and episodic, graphic novels are longer, more in line with traditional books in cost, and usually contain one complete story arc. Frequently, you will find the terms “comics” and “graphic novels” used interchangeably. You will also find, albeit less frequently, the term “sequential art” used to refer to both comics and graphic novels.

Students with Hearing Loss

The Individuals with Disabilities Education Act (IDEA) describes deafness as a condition that prevents an individual from receiving sound in all or most of its forms. Deafness impairs a child’s processing of linguistic information, and this impairment cannot be mitigated through the use of amplification. According to the Centers for Disease Control and Prevention, each year in the United States more than twelve thousand babies are born with hearing loss (U.S. Dept. of Health and Human Services 2009). Hearing impairment is classified relative to an individual’s ability to hear frequencies most readily associated with speech. Generally the range of hearing loss is: slight, mild, moderate, severe, and profound. Gallaudet University conducts an annual survey that collects “demographic, audiological, and other educationally relevant information on children with impaired hearing” in the United States. Findings from the most recent survey (2007–2008) show that of the young people identified by their schools as receiving educational services related to their deafness, approximately 40 percent have a hearing loss that falls within the moderate, moderate to severe, or severe categories. Over 27 percent of the identified students have profound hearing loss.

Children who are hard of hearing or deaf have a much more difficult time learning vocabulary, grammar, word order, and other aspects of verbal communication than do their hearing peers. William Heward notes that children who are deaf, “especially those with a prelinguistic loss of 90 dB or greater, are at a great disadvantage in acquiring English literacy skills, especially reading and writing”. Students who are deaf or who have a hearing loss face significant obstacles in achieving the skills related to reading comprehension that are vital to their learning and literacy efforts.

Children develop language skills in their early years by engaging in talk with adults and by hearing themselves speak. Without access to auditorily based languages this is an experience that is beyond the reach of children who are deaf. This lack of access serves to create formidable barriers to the acquisition of skills in reading and writing the English language. There is a clear connection between a diminished command of spoken English and a deficiency in reading comprehension. As children who are deaf or hard of hearing enter the school years, the deficiencies in their linguistic abilities become more pronounced as they are asked to interact with materials in the same way as their hearing peers.

Educators are presented with distinct challenges as they look for ways to help students with hearing loss move successfully through their educational careers. The way in which information is organized and communicated to a student plays an important role in his or her perception and understanding of the information. One of the ways information is most frequently communicated to students in an educational setting is through text passages, and without proper awareness, educators may overuse text passages to convey information. In doing this, they are depending on students to possess an appropriate level of reading comprehension to ensure that the text is understood. It is when examining the reading levels of graduating students who are hard of hearing that the impact of their linguistic deficiencies becomes most apparent. Barbara R. Schirmer and Sarah M. McGough state, “Deaf students on average have a fourth-grade reading level at high school graduation” (2005, 84). This diminished reading level is specific evidence of these students’ struggle to engage meaningfully with information presented in a traditional text-only format.

Using Graphic Novels

Graphic novels offer a great way to bolster reading comprehension and general academic achievement for students who are deaf or hard of hearing. With their complementary use of text and pictures, “the nature of comics and graphic novels provides integration that is supportive to students who do not have aural experience with English” (Smetana et al. 2009, 238). Research has shown that when faced with challenges in reading comprehension, students who are deaf can benefit greatly from the use of words and pictures together to convey information. For example, in a 2004/2005 study carried out by Mary Marshal Gentry, Kathleen M. Chinn, and Robert D. Moulton it was demonstrated that when provided with materials presenting information in print alone and materials that presented information in print with pictures, the students who were deaf demonstrated a significantly higher level of comprehension with the materials presented in print and pictures.

It is important to understand that graphic novels do more than just present a visual representation of text. The pictures in a graphic novel provide contextual support to the text information and without them the story wouldn’t be complete. Heward explains that “ASL is a visual-spatial language in which the shape, location, and movement pattern of the hands, the intensity of emotions and the signer’s facial expressions all communicate meaning and content” (2006, 371). Through illustrations that support text rather than just restating it, graphic novels provide a depth of information that is absent with text alone. When using graphic novels to engage students who are deaf, Linda Smetana et al. met with great success. They observed, “Graphic novel readers … learned to understand print but also [could] decode facial and body expression, the symbolic meaning of certain images and postures, metaphors and similes, and other social and literacy nuances …” (2009, 231). This is effectively illustrated with an example from Shaun Tan’s graphic novel The Arrival (figure 1). In this example, we see that although there is no text to “tell” the reader what is taking place, by observing facial and body expressions, as well as by noting the use of a symbol that is understood to represent a bed, the reader is able to comprehend the story that is being told with the pictures.


On Crossing: Truly Dynamic

On Crossing is a collaborative work on paper created by two artists who use ritualistic mark making and the exploration of space with fervor in their own studio practices. Jodi Green and Jessica Ann Mills first met in graduate school, and formed a very close professional friendship while working in very close quarters for three years. Both artists have strong ties to geographical areas that have suffered economic downturns due to reliance upon a single industry. In her prints and drawings, Jessica uses small hatched line-work to investigate the poetic beauty of the degradation of Mid-American agricultural architecture and equipment. She obsessively searches for the abandoned by way of the open road. Jodi’s printmaking and performative practices mimic a factory setting, with repetitive stamping, folding and pressing that eventually causes the structural degradation of the work. As a Canadian studying in the United States, crossing and re-crossing an international border became a large part of her life, and subsequently a part of her studio work as well.

On Crossing is an exploration in keeping with the dialogue that was so accessible to both artists during their three years working together but is now separated by hundreds of miles of highway and a definitive border. The end result of this collaboration reflects the conditions of communication across miles, the ebb and flow of a visual dialogue between two image makers, and the beauty and innovation that can surface from ritual.

On Crossing was first exhibited at Graphica Creativa ’09 at the Jyvaskyla Centre for Printmaking in Jyvaskyla, Finland. Participation in Graphica Creativa ’09 was made possible through an exhibition assistance grant from the Ontario Arts Council.

Jessica Ann Mills grew up in Omaha, Nebraska. During her late teens and early twenties she began traveling extensively to different regions of the country, and during this time she began to understand the crisis of identity that she faced as a resident of the largest city in a primarily rural, agricultural state. To the outsider, to be a Nebraskan is to be rural. But to other Nebraskans (ones coming from towns with populations that remained indefinitely in the hundreds rather than the thousands), Jessica was undoubtedly urban.

She took long drives throughout her formative years of college with the windows rolled down, piles of cassette tapes lining the floorboards of the car and a camera on the passenger seat. It was an experience quite similar to one that Rebecca Solnit described of her twenties in her book A Field Guide to Getting Lost: “All those summer drives, no matter where I was going, to a person, a project, an adventure, or home, alone in the car with me, my social life all before and behind me, I was suspended in a beautiful solitude of an open road, in a kind of introspection that only outdoor space generates, for inside and outside are more intertwined than the usual distinctions allow.” Like Solnit, Jessica was endlessly preoccupied by the seeming placelessness of the car because it seemed to echo the same questions she had about her own belonging to a certain place; feeling both from a place and outside a place. The car was a vehicle for her searching.

Jessica completed her Master of Fine Arts degree in 2008 at the Lamar Dodd School of Art, University of Georgia. She currently lives in Omaha, Nebraska and continues a steady art making practice.

Jodi Green grew up in an industrial park. Formerly a Royal Canadian Air Force training station, by the time she was born Huron Park, Ontario had become a government-owned experiment: a tiny town surrounded by farmland and boasting a military-sized airport, large industrial warehouse buildings and 350 low-rent homes, it attracted both businesses and workers. Growing up, the routines of the factory dictated the residents’ daily routines as well. The city Jodi chose in adulthood as her home is also a place defined by labour and manufacturing, merely trading in the chainsaw, boat, drainage tile and pop bottling factories of her youth for Ford, Chrysler and General Motors. In his book Landscapes of the Interior, Don Gayton puts forth a theory of primal landscape, positing that the landscape in which one spends one’s formative years imprints in such a way that one can never be truly comfortable, feel at home, in any landscape vastly different from that first one. While Gayton is speaking specifically here about natural landscapes, Jodi believes that her primal landscape is the factory town with its routines, its predictable traffic patterns tied to shift changes, its metallic and burning chemical smells.

Living in Windsor, Ontario, a city dominated by the automobile manufacturing industry inspires an approach to the act of making that thrives on rules, schedules, daily rituals and documentation. Manual labour is implicit in the repetitive acts of printmaking, and rituals, efficiency of movement and repetition are integral parts of Jodi’s daily studio practice. Obsessive layering, filling of space, piling up imagery on top of itself until everything beneath it is buried and destroyed, speaks in part to the endless manufacturing of more and more and more things, filling up our vision with noise and junk, obscuring the landscape: if you explore this city you can find a seemingly infinite number of parking lots and fields filled with row upon row of brand new minivans, overflow waiting to be loaded on a truck and taken away.


About Will Burtin, One Of My Faves

Found this in a journal lately, about easily one of my favorites: Will Burtin. I know he didn’t MAKE Fortune Magazine, but he might as well have.


Will Burtin was born in 1908 in a working-class district of Cologne, and nothing about his childhood suggested that the boy would one day pioneer several fields of graphic design. As a reluctant altar boy, he spent many early mornings at St Gereon’s Basilica. However, this unhappy experience did confer one lifelong benefit: the basilica’s painted images showed Burtin how design and form effectively conveyed information. At fourteen, he began studying typography in night school while working days for his first employer and mentor, Dr Philippe Knoll. In 1926, Dr Knoll’s typography shop was working overtime to produce text and illustrations for exhibitors at the upcoming international exposition, Dusseldorf’s GeSoLei.

By 1930, Burtin was an independent designer. Business grew with his reputation. His work came to the attention of Joseph Goebbels, the head of the Nazi propaganda ministry. In 1937, after Burtin rebuffed Goebbels’s invitation to head the ministry’s design section, Hitler summoned him to an interview in person. Burtin and his Jewish wife, Hilde, quickly left Germany.

Sponsored by Hilde’s first cousin, wind tunnel designer Max Munk, the pair fled to the US, where, within months, Burtin won a major contract from the US Government to create the Federal Works Agency’s exhibition at the 1939 New York World’s Fair. Soon after that he was illustrating and designing for Time and Life magazines and The Architectural Forum.

Drafted in wartime to the Office of Strategic Services, Burtin headed a design team charged with a priority project: designing manuals for aerial gunners in bomber crews. Burtin’s clear design cut gunners’ training from six months to six weeks.

The late 1940s marked the apogee for using graphic design in magazines to illustrate technical, scientific and medical practices. The postwar world was new: it had discovered rockets, missiles, atomic bombs, antibiotics, jet engines, insecticide, television and the first computers. As the art director at Fortune magazine (1945-49), Burtin illustrated these emerging technologies for the sophisticated business leaders and readers building a new society based on postwar innovation.

Simultaneously, Burtin took over as art director of Scope magazine, the Upjohn Company’s direct mailer to doctors. Reasoning that doctors use their hands to diagnose patients, Burtin responded by combining several different paper stocks in each tactile edition.

Burtin’s large physical models of biomedical processes for Upjohn–the Cell, the Brain, Metabolism, Genes in Action and Defense of Life stand out. Often, Burtin’s work put into understandable form what scientists had previously been unable to visualise. His models attracted widespread attention and were featured in international publications and television specials.

Thus the man with little formal schooling would be instructed in science by the likes of Albert Einstein (prior to illustrating one of the first printed articles on nuclear power), top neurologist Wilder Penfield and futurist Buckminster Fuller. He translated their information in turn into visually comprehensible images.

Fantastic stuff. One of his better bios, to be sure.