The growth in global industries, particularly in finance, transportation, and manufacturing, has presented corporations with great challenges in setting up successful communication links. Those challenges must be met if companies are to improve the management of their international operations and provide enhanced services. Setting up an international network, however, is not easy. It is essential that corporations understand the golden rules of global networking, since international business is fast becoming a networking game. The network budget for multinationals represents a major investment of strategic importance. The decision to set up a global network must rest on solid business reasons; otherwise, there is no point to the strategy. Those business reasons can be identified by working out what the network will be used for, how the applications can benefit the corporation, and where the company is headed. The installation of each new access node must be based on an analysis of cost versus anticipated traffic.
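The cost-versus-anticipated-traffic analysis mentioned above can be sketched as a simple break-even comparison. This is only an illustrative model with invented figures and a hypothetical `node_is_justified` function, not an actual costing method from the text: it compares a node's fixed monthly cost against what the same traffic would cost over an existing per-megabyte alternative.

```python
# Hypothetical break-even check for a new network access node.
# All figures and names are illustrative assumptions, not real tariffs.

def node_is_justified(monthly_node_cost: float,
                      anticipated_mb_per_month: float,
                      cost_per_mb_without_node: float) -> bool:
    """True if anticipated traffic makes the node pay for itself."""
    cost_without_node = anticipated_mb_per_month * cost_per_mb_without_node
    return cost_without_node >= monthly_node_cost

# Example: a node costing 5,000 per month vs. 0.25 per MB over the
# existing link; 30,000 MB * 0.25 = 7,500, so the node is justified.
print(node_is_justified(5000, 30000, 0.25))
```

The same comparison can be rerun as forecast traffic changes, which mirrors the text's point that each new node must be justified individually.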
In most firms, initially, the net will connect international sites and offices. Ultimately, it may link these sites to the firm's trading partners: agents, suppliers, and customers. That means it has to be a true multi-user network supporting different computing hardware and carrying a wide range of information, with great diversity in transmission sizes, transmission frequency, and the "value" or "importance" of content.
The underlying applications will be equally diverse, but fall into three main categories:
• Batch transfers between major sites. This usually includes financial reports, orders from distribution sites to factories or central warehouses, CAD and similar design information from design offices to factories, and feedback on sales, product quality, and customer and supplier performance. In each case, the source and destination of the material are fixed. Volumes tend to be high and transfers are regular, often scheduled.
• Interactive consultation with central databases or application programs. This includes user requests for information, perhaps on prices or stock quantities or shipping schedules, updating central files to record new sales and orders, the completion of safety checks, and so on.
• Messaging systems. This includes electronic mail for direct communication between people at different sites, often in different countries. This type of application will also cover electronic data interchange (EDI), which links the company and its trading partners.
The first generation of client/server application development tools marketed over the past five years offered graphical rapid application development on PCs. Development and business professionals using these tools focus on what they see on the screen rather than on the complexity underneath. And because the GUIs of these tools are quite sophisticated, developers assume that the tools are also sophisticated enough to handle other aspects of the application, such as complex business logic and data management. Unfortunately, this assumption is invalid. The sophistication of a first-generation tool's GUI represents only the tip of the iceberg: that sophistication does not extend below the water line.
Developers have found this out the hard way. If they try to make a first-generation tool handle the requirements of larger-scale applications, they usually have to program in one or more third-generation languages.
Also, most first-generation tools tightly integrate user interface code with application logic, requiring that all the logic reside on the client. If that logic involves data access, moving it to the server, which would improve scalability, becomes almost impossible. This integration also forces all data to be moved across the network between client and server, resulting in a network bottleneck.
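The network-bottleneck point can be made concrete with a small sketch. The table, column names, and in-memory database below are invented for illustration: the first query pulls every row to the client and filters there (the first-generation pattern), while the second lets the database apply the logic so only matching rows would cross the wire.

```python
import sqlite3

# Illustrative data; table and column names are invented for the example.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, "EU", 100.0), (2, "US", 250.0), (3, "EU", 75.0)])

# First-generation style: fetch every row to the client, filter there.
# Over a real network, all rows cross the wire.
all_rows = conn.execute("SELECT id, region, amount FROM orders").fetchall()
eu_client_side = [row for row in all_rows if row[1] == "EU"]

# Moving the logic to the server: only matching rows are returned.
eu_server_side = conn.execute(
    "SELECT id, region, amount FROM orders WHERE region = ?", ("EU",)
).fetchall()

# Same result either way; the difference is how much data travels.
assert sorted(eu_client_side) == sorted(eu_server_side)
```

With three rows the difference is trivial; with millions of rows per site, filtering on the server is what makes the application scale.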
Finally, because they require the use of proprietary extensions and languages, first-generation tools force the creation of applications in which the data management logic is tied to a single database; if the application needs to access another database, the developer must modify the application logic.
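One common way around that single-database tie-in is a thin data-access abstraction. The sketch below is hypothetical (the `OrderStore` interface and its adapter are invented names): application logic depends only on the interface, so supporting another database means writing a new adapter, not modifying the logic.

```python
from abc import ABC, abstractmethod

class OrderStore(ABC):
    """Data-access interface the application logic depends on."""
    @abstractmethod
    def total_for_region(self, region: str) -> float: ...

class InMemoryOrderStore(OrderStore):
    """Stand-in adapter; a real one would wrap a specific database driver."""
    def __init__(self, orders):
        self._orders = orders  # list of (region, amount) tuples

    def total_for_region(self, region: str) -> float:
        return sum(amount for r, amount in self._orders if r == region)

def report(store: OrderStore, region: str) -> str:
    # Application logic: unchanged whichever adapter is plugged in.
    return f"{region}: {store.total_for_region(region):.2f}"

store = InMemoryOrderStore([("EU", 100.0), ("US", 250.0), ("EU", 75.0)])
print(report(store, "EU"))  # EU: 175.00
```

Swapping databases then touches one adapter class rather than every piece of application logic, which is the portability first-generation tools lacked.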
The maturation of application development has resulted in a second generation of client/server development tools that not only accommodate increasing levels of complexity, but also result in increased programmer productivity across the user interface, logic, and data management.
These tools also include sophisticated services to handle the development process, which typically involves teams of developers. Also, a tool's deployment capabilities accommodate a range of cross-platform enablers at the interface, middleware, and networking levels. Moving corporate-wide development from the mainframe to client/server requires the same tools and capabilities that were an integral part of host development. Such tools have recently begun to appear on the market to help organizations in their migration to corporate-wide client/server application development.
When selecting second-generation tools for large-scale application development, organizations need to consider their development capabilities, their deployment capabilities, and the flexibility of the tool to adapt to changes over time.
For example, development tools used to build corporate-wide applications must support complexities across the user interface, logic, and data components of the application. In terms of deployment, the tools must be able to handle the issues presented by multiple computing platforms, increasing numbers of users, and multiple databases.
The tool must also provide flexibility over time. Products, services and business units continually evolve; development and deployment environments must mirror and support these changes. Today's departmental application may evolve into an enterprise-wide application tomorrow. One large department may be divided into two business units and become geographically split, possibly internationally. Second-generation client/server development environments are designed to handle such changes.
Computer graphics – that eye-catching software used in movie special effects, car design, and wind-tunnel simulation – is fast becoming a key battleground in the competition between Unix workstations and PCs.
Users agree that the two platforms are converging. But they are betting that Unix workstations will continue to offer better performance, although at a higher price. Cost is one factor pushing graphics to low-cost PC platforms, and the sheer ubiquity of PCs is another.
PCs are even taking their place on researchers' desks. We are in a transition period, but people seem to be more excited about PCs. Now researchers will start programming visual applications on PCs and test-driving PC peripherals.
Yet even PC advocates don't think the transition to PC graphics platforms will happen overnight, or remove the need for high-end systems.
Software designers agree that specialized needs will always require specialized hardware. The problem is that the speciality market of today will be the consumer market of tomorrow.
The gap between Unix workstations and PCs will narrow further, users and vendors agree. Until recently, however, most visualization and three-dimensional graphics software made its debut on Unix systems.
But the market is driven by what customers want, and therefore computer graphics is drifting more and more toward PCs. Still, PC users should prepare to add memory, video cards, and peripherals to get the most out of graphics packages.
The hidden cost of PCs may also be the amount of time it will take to configure them with all the options for graphics. And some high-end graphics packages can cost just as much as a Unix workstation.