Schwab CIO Mark Barmann Discusses Firm's 5-Year Overhaul Plans

INTERVIEW

Charles Schwab & Co.'s Mark Barmann is leading a five-year project to re-engineer the discount brokerage's systems in coordination with its strategic growth plans into the mid-1990s.

Barmann is executive vice president and chief information officer of a company where approximately 20 percent of the annual expense budget is dedicated to information systems.

He has been with Schwab since 1988, prior to which he was president and chief executive officer of First Interstate Services Co., a unit of First Interstate Bank.

TST: You're in the middle of a planned five-year downsizing move off your mainframes. Why?

Barmann: We don't call it downsizing, but yes. Schwab has made its mark as a discount brokerage; it specialized in efficient transaction handling, and many say we helped create the industry that we now dominate. Our growth ambitions, however, exceed that. We are broadening and expanding our product line to include our own mutual funds and an increasing number of others' mutual funds, and we're beginning a wide variety of fixed-income products. So we are going for more share of wallet, of net worth, of balance sheet. We have a five-year strategic plan in pursuit of that growth, which calls for enrichment of our product offerings. Our technology, our order-entry system, is more appropriate to what we have been than to what we intend to be.

And we've set out on a five-year program to restructure that information system to support what we seek to become, while retaining what's important about where we've been. Our current system is a mid-1970s vintage design, installed here about 12 years ago. It's based on a classic IBM mainframe single-system-image model: MVS, CICS, Datacom. It's stamped right from the IBM mold. It can only run in one processor; it can't run in multiple processors, and that's what I mean by a single-image system. Therefore, we've been constrained. We've been governed by the introduction of ever larger mainframes: an expensive and sometimes risky proposition, as our business has at times grown faster than IBM's introduction of each yet more powerful machine. And those machines have consistently cost in the vicinity of $100,000 to $125,000 per MIPS [million instructions per second] of computing. And we think that's an increasingly cost-ineffective way to grow versus a number of alternatives.

So about two years ago we kicked off SAMS: the Schwab Application and Migration Strategy. It defines our future as characterized by distributed processing, not by centralized processing. It calls for three tiers: an upper tier, in which the mainframe serves as a gigantic server for a large library of largely nontransactional data; a middle tier of networked distributed processors, probably UNIX, spread across the country -- that's really the transactional workhorse of the firm; and, of course, the client/server model in our various offices and telemarketing centers. Three tiers: upper, middle, and lower, the lower tier being client/server.

We envision data being resident in all three tiers; data of a different nature resident in a different tier, to respond to the retrieval requirements of people connected to the various tiers.

TST: So when you say that the mainframe will be serving nontransaction-oriented business, is that client account information?

Barmann: In part. It's also information on our broad segments, on the metrics of the buying patterns of our customers, on the profitability contribution of our products and, of course, all our financial systems. I hesitate to call it archive information -- that's not quite right -- but it's not the information that handles the staccato kind of transactions, which should be satisfied by data held elsewhere.

TST: How would the client account information that's used on a day-to-day basis be distributed?

Barmann: We haven't decided exactly where the data will be located, and it'll probably change over time.

Our game plan is a little bit like a quarterback throwing to an end who hasn't yet run his pattern. What we'd like, though, is a virtual database: a seamless, logically integrated, physically dispersed database where no one -- neither the programmer nor the Schwab representative -- cares where the data is, even though it's scattered across three tiers. Queries against it would involve multiple joins and multiple calls to databases in the various tiers.

We'll get some experience with where we locate this data, and when we look at the number of accesses of that data, we will be able to dynamically decide where portions of customer account or other data should be located across those tiers. So the data will tell us something about itself, about its frequency of calls, and we'll make adjustments accordingly. But the notion is of something not now available: this logically integrated, physically dispersed database across three tiers, seamless to most parties involved.
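The placement idea Barmann describes, letting observed access frequency decide which tier data lives in, can be sketched in miniature. The tier names, thresholds and counter logic below are illustrative assumptions, not anything Schwab specified:

```python
from collections import Counter

class PlacementAdvisor:
    """Illustrative sketch: recommend a tier for each data element
    based on how often it has been accessed. Thresholds are assumed."""

    def __init__(self, hot_threshold=100, warm_threshold=10):
        self.accesses = Counter()
        self.hot = hot_threshold
        self.warm = warm_threshold

    def record_access(self, element):
        # "The data will tell us something about itself": count each call.
        self.accesses[element] += 1

    def recommend_tier(self, element):
        n = self.accesses[element]
        if n >= self.hot:
            return "client"      # frequently read: push toward the lower tier
        if n >= self.warm:
            return "middle"      # the transactional workhorse tier
        return "mainframe"       # rarely touched: leave on the upper-tier server

advisor = PlacementAdvisor()
for _ in range(150):
    advisor.record_access("account_balance")
advisor.record_access("tax_lot_history")

print(advisor.recommend_tier("account_balance"))
print(advisor.recommend_tier("tax_lot_history"))
```

In a real system the counters would be aggregated over time windows and the recommendations acted on by a data-migration job, but the principle is the same: observed usage, not a fixed design, drives placement.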

TST: What are some of the tasks that you envision?

Barmann: Trade routing, interest calculations, electronic mail. Right now the only tool we have is a hammer and therefore all opportunities are nails. So in our CICS system we have everything from vital trade processing -- which is of course the very nerve of our operation -- to E-mail, to the names and addresses of our branches and brokers. We'd like specialized processors so we can distinguish reliability, cost, response time. Right now, when everything is buried behind a single processor, we only have that hammer to satisfy all our nails, all jobs. So I envision that we'd have different processors, each relegated to a different subset of our functions; some of them larger than others, certainly, but each tailored to the job at hand. So we'd have a full array of carpentry tools when we are finished.
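Barmann's "carpentry tools" point, dedicating a different processor to each class of work instead of burying everything behind one system, amounts to a dispatch by job class. The handler names below are hypothetical, purely to illustrate the shape of the idea:

```python
# Hypothetical sketch: route each job class to its own specialized handler
# rather than pushing every workload through one monolithic system.

def route_trade(order):
    return f"trade-processor handled {order}"

def route_email(msg):
    return f"mail-server delivered {msg}"

def route_directory(query):
    return f"directory-service answered {query}"

# Dispatch table: job class -> dedicated handler
# (the right tool for each job, not one hammer for every nail).
DISPATCH = {
    "trade": route_trade,
    "email": route_email,
    "directory": route_directory,
}

def submit(job_class, payload):
    handler = DISPATCH.get(job_class)
    if handler is None:
        raise ValueError(f"no processor for job class {job_class!r}")
    return handler(payload)

print(submit("trade", "BUY 100 XYZ"))
```

The payoff of splitting things out this way is exactly what Barmann names: each handler can be sized and tuned separately for reliability, cost and response time.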

TST: Are you now evaluating hardware platforms?

Barmann: Yes, we are, and we want to defer that decision for as long as we can. We were gratified to see IBM's announcement last month of the new level of RISC/UNIX processor at a new, very attractive price point. We expect that trend will continue.

We are doing proofs of concept on RS/6000s today, but we are not wedded to that. In fact, we recently joined the Open Software Foundation. We are very supportive of as open an architecture as we can strive for. Our conceptual thinking is that we should be able to snap OSF-compliant hardware and software components in and out to suit our needs.

We are also looking at HP, and we have discussions with DEC; we're keeping our options open. We're prototyping on the RS/6000, and we think that's reasonable; IBM has shown a refreshing willingness to work with us.

I understand Quotron [Systems Inc.] is looking at the same potential platform. They're now a major quote supplier to us; they may or may not remain so, but if there's some synergy in their work and ours, we'll be delighted.

We're working closely with Seer Technologies [Inc.], the First Boston [Corp.] spin-off from whom we license HPS, a major CASE [computer-aided software engineering] tool, which has the ability to generate execution-time code on either a PS/2 platform or a UNIX platform.

TST: How will that change the way Schwab systems are developed?

Barmann: All the time I've been in data processing, people have automated business processes as they found them in the enterprise. That is, they've applied the efficiencies of automation onto existing work flows. And they've done so sequentially around firms -- and I don't mean just brokerage firms; any firm. A computer has really been applied against the work flow as analysts came upon it that particular day. In many cases, those work flows are accidental and change over time. And yet automation has a way of fossilizing those work flows; perversely, while it has made for much more efficiency, it has made changing them much more difficult. So there's a certain inflexibility that accompanies the efficiency automation has brought over the past 25 years.

A lot of the disenchantment with IS over the last few years stems from the questionable value it has brought for the billions of dollars invested in it. In my opinion it's this realization: with the efficiency has come another side -- fossilization, freezing a still-life picture of work processes that are now undergoing re-examination as companies re-engineer and downsize.

Having said that, our approach is to model the business in terms of the data that underlies it. So we look beyond the empirically observable and we model what lies beneath: the interplay among the data elements, entities and relationships that underlie the entire firm. This largely ignores the arrangement of work and gets down to the data that the work is laid on: a more steerable, more mutable basis on which to design systems.

And there we get at the redundancy, the repetition, the multiplicity of like functions scattered in different places around the firm. It gives us the ability to decompose that activity and recompose it in ways in which it can be reusable across the firm. But we can look through the veil of current work flow and get at the more fundamental truths about the business the firm does by looking at the data on which it operates.

We see some opportunities to rearrange work groups, even the overall organization of the firm, to be more amenable to what we want in the models, because certain revelations arise out of the model building that suggest inefficiencies in the way we are organized. We haven't decided what those are yet; we're not far enough along.

TST: What kinds of data are you looking at?

Barmann: Transaction data, account data -- there are a few thousand data elements. I'm talking about the fundamental building blocks of any business. But we carry a lot of data that's derived, that's a combination of one or more pieces of atomistic data. You'd be surprised how relatively few elemental pieces of data are necessary to run the brokerage business.

TST: Can you give one specific example to illustrate how you've used data modeling?

Barmann: Well, let me give you a trivial example -- interest calculation. I know there are people who do interest calculations around the firm: the margin department, the finance division, the people who work with our Schwab One account. Our current systems have multiple interest calculation routines, each individually handcrafted. Our data modeling identifies the opportunity to develop a single interest calculation routine that might be used by a number of different groups around the firm.

This notion of reusability -- some people call these software integrated circuits, software ICs, and I think in the analogy to the computer business that's quite apt. In the old days, we would have called them subroutines. We might call them software ICs that would be built once and could be reused exactly as they are. So when we build new products and services, we would go to our partially assembled inventory of business objects, rules or routines, and piece together an interest calculation with a check-writing capability, with a particular format of report. And we can speed the product development cycle by selecting combinations of reusable business elements rather than building a product fresh from the ground up. And then the organization might be refashioned, rebuilt around our products or our segments or these functional cores that we come upon through the model.

And we find that the overall cost effectiveness comes through the use of models and the reusability in the design of atomistic parts of our business. The product development cycle should be speeded greatly, as the developers will be using semi-assembled components. Testing should be easier, as should the overall development cycle. Right now, we face product enhancement and development cycles of 9 to 12 to 15 months from conception to introduction. We'd like to shrink that to weeks, and we think we can do that with a mature system, sometime in the mid-1990s.
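The interest-calculation "software IC" Barmann describes can be illustrated as one shared routine reused by several departments. The day-count convention (actual/360) and the two callers below are illustrative assumptions, not Schwab's actual formula:

```python
from decimal import Decimal

def simple_interest(principal: Decimal, annual_rate: Decimal, days: int) -> Decimal:
    """One shared interest routine (a 'software IC') instead of several
    handcrafted copies. Actual/360 day count is an assumption here."""
    return principal * annual_rate * Decimal(days) / Decimal(360)

# Different groups reuse the same component with their own inputs:
margin_interest = simple_interest(Decimal("50000"), Decimal("0.09"), 30)    # margin dept
schwab_one_credit = simple_interest(Decimal("12000"), Decimal("0.04"), 30)  # Schwab One

print(margin_interest)     # 50000 * 0.09 * 30/360 = 375
print(schwab_one_credit)   # 12000 * 0.04 * 30/360 = 40
```

The point is not the arithmetic but the single definition: a change to the day-count convention or rounding rule happens once, and every caller inherits it.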

TST: It sounds like you're talking about object-oriented programming?

Barmann: Well, not in the sense in which the literature talks about it. There is an object orientation to what we are doing but it's not the pure object-oriented programming that is just coming into the market. It's an important distinction.

There are elements of object-oriented programming guiding us, but we're betting on relational databases right now. We're not looking for the pure object-oriented approach but components of it, that way of thinking. Sybase [Inc.], Informix [Software Inc.], Oracle [Corp.] -- these are all companies we are talking to now.

You raise an interesting question -- should we be designing something more purely with object-oriented techniques? -- and I think it's early for us to predicate a massive transition from our 1975-vintage systems on object-oriented systems. So I believe in making about a three-quarters-of-the-way step to relational database systems with some object-oriented flavor, but not [NeXT Computer Inc.'s] NeXTstep. We're not prepared to bet the massive migration that we face on technologies that are still pretty immature.

TST: Where are you now?

Barmann: Well, we'll finish the enterprise model in the spring. We have built our first operational prototype, called Margin Account Monitor, using the HPS CASE tool. The CASE tool, with the help of an upper-CASE suite of tools delivered by Bachman and married through a coupling to HPS, allows the developers to translate the models directly into executable code, either for the mainframe or for distributed processors.

It runs on two tiers -- not three -- a host and PS/2s. It takes a continuous data feed into our margin accounts and gives us intraday monitoring of the margin accounts we want to pay closest attention to during movements in the market. It has a powerful graphical user interface: windows, pull-down menus. It looks like a Mac, and it was developed using HPS, which created cooperative-processing execution-time code spread across both the host and the PS/2s. That's because UNIX isn't yet available; that "receiver" isn't ready to accept the "pass" yet. But we are still doing proof-of-concept work with UNIX, too, and we're encouraged. Now, we've got an order-entry system -- the heart of our business. A component of that is the next major undertaking. It will take more than the rest of this year, and it will be built from the models.

And we'll move forward with the CASE tools, the models as the basis, and a growing library of partially assembled, reusable functional components, with their related data, in something like an object form.
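The Margin Account Monitor Barmann describes, a continuous feed driving intraday checks on margin accounts, can be sketched abstractly. The account fields, the 30 percent maintenance threshold and the flagging rule are all assumptions for illustration, not the prototype's actual logic:

```python
# Hypothetical sketch of intraday margin monitoring: as prices tick in,
# flag accounts whose equity ratio falls below a maintenance threshold.

MAINTENANCE_RATIO = 0.30  # assumed maintenance-margin threshold

def equity_ratio(market_value, debit_balance):
    """Account equity as a fraction of current market value."""
    return (market_value - debit_balance) / market_value

def flag_accounts(accounts, prices):
    """Return ids of accounts whose equity ratio is below maintenance."""
    flagged = []
    for acct in accounts:
        mv = sum(qty * prices[sym] for sym, qty in acct["positions"].items())
        if equity_ratio(mv, acct["debit"]) < MAINTENANCE_RATIO:
            flagged.append(acct["id"])
    return flagged

accounts = [
    {"id": "A-1", "debit": 6000.0, "positions": {"XYZ": 100}},
    {"id": "A-2", "debit": 1000.0, "positions": {"XYZ": 100}},
]

# A price tick arrives from the continuous feed; re-check both accounts.
# A-1: equity 2000 / 8000 = 25%, below the 30% threshold.
print(flag_accounts(accounts, {"XYZ": 80.0}))
```

A production monitor would evaluate incrementally per tick rather than recomputing every account, but the check itself is this simple comparison run continuously.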

TST: Another component of your existing systems is linkage to external order-execution systems, both for fixed income and equities. How will that look under SAMS?

Barmann: We're still detailing our fixed-income strategy. It'll be a while before the business side begins to define what their strategy is.

So we'll flesh out those models to the point where they're coupled to externally provided services that are likely to be done better by others. So our model doesn't have us build a fixed-income trading capability like Goldman Sachs [& Co.]'s or like Morgan Stanley [& Co.]'s. And that's why we are investigating other trading systems. We are looking to see where the interfaces are, where their systems can be tapped, where our systems can permit an electronic interchange between their information -- which we don't want to duplicate -- and the needs of our customers. We have made no decision about that, yet. We don't need to, but our models are built in expectation that there will be a link-up somewhere and we'd like to make that as clean an interface as possible.

TST: And the Mayer & Schweitzer [acquired by Schwab last year] MASTERS order-execution system for over-the-counter equities. Will that continue to operate separately?

Barmann: Exactly. That's currently coupled to us via an electronic link, and our models leave it as it is for the moment. There is no reason that over time our models couldn't be further elaborated to encompass what's done outside. But we want to stay within ourselves; we don't want to bite off more than we can chew.

The challenge that people like me face is that a lot of the early work is hidden from the eventual customer: a lot of it is training our own people in the use of CASE tools, training them in different analysis and design techniques, and acquainting them with the prosaic but important aspects of running physically distributed systems. For the first couple of years, not a lot seems to go on. We're still in that phase; we have one year of it behind us, and 1992 is a little bit more of the same. So in 1993, we will unveil the first major order-entry system. It'll be the product of fundamentally different processes.

Now we have to build the bridges between the old system's data and functions and the new system's. If we were starting a brand-new brokerage firm, we wouldn't have to worry. But we've got to continue to operate for the next five years with two architectures: the Beta System, which is what we call our current system, and SAMS. For five years, we're going to be standing astride the two, pulling data back and forth between them, before we can extinguish the old one. That's tricky, and it's made infinitely more complicated by our desire to look at the business on a model basis rather than simply re-engineer masses of data into new technologies.

We made a commitment to entrust it largely to our current IS organization rather than pay mercenaries who could probably build it faster, such as Andersen Consulting [Inc.] or a number of other systems integrators. But the result is high training requirements and the potential for a schism in the organization, as some people work on Beta and some work on SAMS. We've tried to mitigate that by saying everyone will have an opportunity to play in the new world. We made a decision at the outset to communicate very broadly, and it's calmed some of the anxiety that I think would otherwise be there. But there are still old habits to break -- fear and anxiety, and this ambiguity. There is something very certain about old-time system building: you can see what you are going to make. Here, with models, even to the people who work with them most intimately, there is a level of abstraction, unreality and intangibility to this.

TST: Why are you going through the headache of re-training rather than hiring?

Barmann: To reward the loyalty and belief of those who worked so hard to keep the current system stable and as rich as it is. That's the altruistic and genuinely felt sentiment at this firm -- tied to the West Coast, maybe, or to Chuck Schwab, founder and chairman, still active.

It's also the practical problem of knowledge transfer, which is always a problem when an outsider builds anything. When the system is built, we want the people who built it to be here; we don't want EDS [Electronic Data Systems Inc.] or someone to hand us this system.

And thirdly, and maybe most importantly, we want the people who build it to have equity in its success: to have a stake in the choice of the components, in the drafting of the migration plans, in mastering the new methodologies and technologies, and a personal and professional stake in the success of its operations.

With it, though, we buy an elongation of time, the learning curve, and disparate, sometimes trailing, enthusiasms. But we thought that was better.
