Add New Notes
150  Zim/Interview/An_Interview_with_Brian_Kernighan.txt  Normal file
@@ -0,0 +1,150 @@
Content-Type: text/x-zim-wiki
Wiki-Format: zim 0.4
Creation-Date: 2011-03-27T20:38:21+08:00

====== An Interview with Brian Kernighan ======
Created Sunday 27 March 2011

An Interview with Brian Kernighan
by Mihai Budiu
http://www.cs.cmu.edu/~mihaib/kernighan-interview
Spanish translation by Leonardo Boshell
July 2000

Introduction

During the summer of 1999 I had the chance to be a research intern at Bell Labs, the research arm of Lucent Technologies. I dared then to ask Dennis Ritchie and Brian Kernighan for an autograph on their C book.
{{./kernighan.jpg}} {{./c-book.jpg.gif}}

In the summer of 2000 I went to Bell Labs again for a research internship. This time I boldly ventured to ask Brian Kernighan for an interview for the Romanian computer magazine PC Report Romania, for which I am assistant editor. He very kindly replied:

Date: Mon, 10 Jul 2000 14:52:15 -0400
To: mihaib@research.bell-labs.com
From: "Brian Kernighan"
Subject: re: odd request

sure, no problem. i'm probably pretty boring, but since
i don't read romanian, you can make things up...

come by any time; i'm mostly around.

brian

The interview appeared in the August issue of the magazine, in Romanian. However, I reckoned that Mr. Kernighan's opinions might make very interesting reading for an English-speaking audience too, so I decided to release the interview in English as well (with his approval). Here it is; enjoy! By the way: nothing is made up!
The Interview

M: What is the correct way to pronounce your name? I hear it is not the obvious way.

K: It's pronounced Kernihan; the g is silent.

Brian Kernighan

M: You chose to work in computer science when this was not such an obvious career choice. Can you tell us how you made this choice, and what you think of it in retrospect?

K: It's true that I started working with computers probably in the mid to late sixties, when things were fairly early on, and it was entirely by accident. I think I saw my first computer in 1963; it was an old IBM 650. I didn't do any serious programming until 1964, when I was in my last year of college. But it was fun, and it was before computer science was in any sense a field. When I went to graduate school there was a Computer Science program in the Electrical Engineering Department at Princeton. This was fairly typical of a lot of places: computer science was not a separate academic field, it was just part of some department that might have a computer or people interested in computation, so I just backed into it, entirely by accident. This has been a lucky accident, because obviously the field has had a lot of interesting things happen.

M: You've been in this area for a long time, and you've been a very important player in the evolution of computer science. Some of your work has had a profound impact. Can you point out some things that you consider fundamental advances in computer science in the last 30 years, some changes of paradigm that have happened?

K: I think that there have been a fair number of changes, not necessarily in ``computer science'', but in computing in general. The obvious one is that hardware has gotten enormously faster: Moore's law, although it is a simple quantitative change, is an exponential growth that, applied for 30 years, makes an enormous change; some piece of that relies on computer science, but not much. At the same time, the thing I am more familiar with, and more interested in technically, is the use of various kinds of programming languages so that we're better able to convey to a machine what we want to do. The growth of languages, of the technology for understanding how to express things for a machine, has had an enormous impact as well. Of course, as machines have gotten more powerful, you can afford to devote more resources and use languages that were not efficient 25 or 30 years ago but which are now usable. Other important changes are algorithmic improvements, which truly belong to computer science; also the idea of NP-completeness, which enables us to think about what's easy and what's hard. But as far as I am concerned, the thing I find most interesting is the growth in programming languages.

M: Over the years you have worked in many different areas: graph algorithms, software engineering and software tools, digital typography, but most of your research was in programming languages. What are your current research interests?

K: [laughing] It's interesting: what I've been doing for the last couple of days is a hack-attack back to almost my earliest roots in computerized document preparation, or digital typography if you like. I have been working on taking eqn, which is a program Lorinda Cherry and I wrote back in 1974, and converting it to produce output in XML or HTML, so we can put mathematics more easily on web pages. There are a lot of people who have tried various takes on this, but none of them seem to be very good, or at least not ready for use yet. We have a local need for it, so I am sitting there and working on it, a C program that I wrote literally more than 25 years ago. It's a very small C program, and I am having a great deal of fun trying to fix it up. This is one piece of what I am doing, a very small thing, but it's kind of fun to go back and redo something that I spent time on so many years ago.
The other thing that I'm working on is a language called AMPL, which David Gay and Bob Fourer and I did; AMPL is a language for specifying optimization problems, for setting up things like linear programming problems. We're trying to wrap it up so that it can be used as part of larger processes. We're putting an object-oriented interface on it, so it can be buried inside some other program or used as a COM or CORBA object.

These are the two things that I'm doing at the moment.

M: Speaking of programming languages, does the eqn program still compile?

K: Yes, it still compiles; I probably hadn't compiled it for five or ten years, and it compiled without any problem whatsoever. It's a very simple and small C program; I probably converted it to ANSI C back in the late '80s. What I've been doing is mostly throwing things away, because the output language now is simpler than before.

M: Most people probably know you because of the C book, so let me ask you a couple of questions about the C language. C has indeed had a very profound influence; what do you think are the most valuable features of the C language?

The C Book

K: C is the best balance I've ever seen between power and expressiveness. You can do almost anything you want to do by programming fairly straightforwardly, and you will have a very good mental model of what's going to happen on the machine; you can predict reasonably well how quickly it's going to run, you understand what's going on, and it gives you complete freedom to do whatever you want. C doesn't put constraints in your way, it doesn't force you into using a particular programming style; on the other hand, it doesn't provide lots and lots of facilities, it doesn't have an enormous library, but in terms of getting something done with not too much effort, I haven't seen anything to this day that I like better. There are other languages that are nice for certain kinds of applications, but if I were stuck on a desert island with only one compiler, I'd want a C compiler.

M: Actually C is also my favorite programming language, and I've written a lot of programs in it, but since I began writing compilers for C, I have to confess I've begun to like it less. Some things are very hard to optimize. Can you tell us about the worst features of C, from your point of view?

K: I can't comment on the ``worst'', but remember, C is entirely the work of Dennis Ritchie, I am but a popularizer, and in particular I cannot say what is easy or hard to compile in C. There are some trivial things that are wrong with C: the switch statement could have been better designed, the precedences of some operators are wrong, but those are trivial things and everybody's learned to live with them. I think that the real problem with C is that it doesn't give you enough mechanisms for structuring really big programs, for creating ``firewalls'' within programs so you can keep the various pieces apart. It's not that you can't do all of these things, that you can't simulate object-oriented programming or any other methodology you want in C. You can simulate it, but the compiler, the language itself, isn't giving you any help. But considering that this is a language which is almost 30 years old now and was created when machines were tiny compared to what they are today, it's really an amazing piece of work and has stood the test of time extremely well. There's not much in it that I would change.
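[N.B. (editor's sketch, not from the interview): the kind of hand-built ``firewall'' Kernighan describes can be illustrated with an opaque type, a struct that is declared but never defined in the public header. All names here are invented.]

    /* stack.h -- the public facade: callers see only declarations */
    typedef struct Stack Stack;        /* incomplete type: layout hidden */
    Stack *stack_new(void);
    void   stack_push(Stack *s, int v);
    int    stack_pop(Stack *s);

    /* stack.c -- the private side, behind the firewall */
    #include <stdlib.h>
    struct Stack {
        int items[128];
        int top;
    };
    Stack *stack_new(void)
    {
        Stack *s = malloc(sizeof *s);
        if (s != NULL)
            s->top = 0;
        return s;
    }
    void stack_push(Stack *s, int v) { s->items[s->top++] = v; }
    int  stack_pop(Stack *s)         { return s->items[--s->top]; }

The discipline is entirely by convention: beyond rejecting direct access to the incomplete type, the compiler, as Kernighan says, isn't giving you any help.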
Sometimes I do write C++ instead of C. C++, I think, is basically too big a language, although there's a reason for almost everything that's in it. When I write a C program of any size, I will probably wind up using 75, 80, 90% of the language features. In other words, most of the language is useful in almost any kind of program. By contrast, if I write in C++ I probably don't use even 10% of the language, and in fact the other 90% I don't think I understand. In that sense I would argue that C++ is too big, but C++ does give you many of the things that you need to write big programs: it really does make it possible for you to create objects, to protect the internal representation of information so that it presents a nice facade that you can't look behind. C++ has an enormous amount of mechanism that I think is very useful, and that C doesn't give you.

M: I have a question about research in language design. It's interesting, for instance, that Java is very much hyped and the community is split over the merits and flaws of the language. The language has indeed acquired some nice features proposed by researchers in the area (like garbage collection), but researchers also point out some of its weaknesses (like arrays, which are covariant and shouldn't be). There's a whole body of research done in programming languages nowadays, and a very interesting body of research in functional programming languages, but you don't see this research really influencing the real world, i.e., what people are actually using on an everyday basis. Instead, all sorts of ad-hoc languages pop up, like Perl or Python or stuff like that. Where do you see the fault; what's not right?

K: That is unfortunately a very good question, and there's a certain amount of discussion here at Bell Labs between a very strong group in functional programming languages and a group using very much ad-hoc, pragmatic languages. I honestly don't know why the functional languages don't succeed. Take ML, which is arguably the best combination, perhaps the one that ought to succeed: in spite of being a very well designed language, thought hard about by a lot of good people over a very long time, embodying an enormous amount of effort in compiler technology, it still does not seem to be broadly used. I will oversimplify a lot, and probably offend my friends, by saying that the only thing people do with ML is to make ML compilers. [laughing] I'm overstating intentionally, but it has some of that flavor, and I don't really understand why. I think, speaking only for myself, part of the reason that ML in particular, and functional programming languages in general, have not caught on more broadly is that they're aimed at people who have mathematical sophistication, who are able to think in more abstract ways that lots of other folks, and I include myself, have trouble with. Whereas languages like C are very operational: you can see how every single piece of them maps into what's going on in the machine in a very, very direct sense. If I had been brought up at a different time and in a different environment, perhaps I'd be totally comfortable in ML and would find C unsafe, a little dangerous, not very expressive. But my sense is that the functional languages come out of a fairly mathematical community and require a fairly mathematical line of reasoning, and therefore are difficult for people on the street.

M: So I guess the suggestion is for the researchers to somehow lower the language level, to promote the good qualities?

K: I didn't really answer the other part of your question, why research in languages has not had as much effect as it should have. I think actually it has had an effect, in places like parser technology and code generation, no matter what the language is: research has had a big effect on building language tools, but less on the design of languages per se.

The languages that succeed are very pragmatic, and are very often fairly dirty because they try to solve real problems. C++ is a great example of a language that in many ways has serious flaws. One of the flaws is that it tried very hard to be compatible with C: compatible at the object level, compatible very closely at the source level. Because of this there are places where there's something ugly in the language, weird syntactic problems, strange semantic behaviors. In one sense this is bad, and nobody should ever do that, but one of the reasons that C++ succeeded was precisely that it was compatible with C: it was able to use the C libraries, it was usable by the base of existing C programmers, and therefore people could back into it and use it fairly effectively without having to buy into a whole new way of doing business. And this is not the case for ML, which was being done at about the same time and, at least partly, in almost the same place, but which took a very different view of the world. As a pragmatic thing, C++ is extremely successful, but it paid a certain price by being compatible with the previous language.

M: So you're an advocate of incremental evolution. I see that you're an author of eight books, all of them co-authored. Does this mean that you have a collaborative research style?

K: If you're going to write a book, it is a heck of a lot easier to get someone else to do a lot of the work [laughing]. I have been very fortunate in having very good collaborators on all of these books, and in that sense it is just enormously easy. It is easier to do something like a book, which needs six months or a year of work, if you've got somebody else who's also working on it with you. Also it's a sanity check, helping to make sure you don't go too far off in one direction: you've got somebody else steering you back into what they think is the right direction.

I think everything I've done I've done with somebody else: it's more fun to work with other people than to lock yourself in an office and do it all by yourself. And I think I'm probably better at listening and finding somebody who's got a good idea and then working with that person on the good idea, rather than trying to invent one of my own.

M: Speaking of sanity checks, I am working on a project which involves a large code base; some functions are edited by several people, who constantly change the style of the indentation and the identifiers. You have published some books on coding style: does your style always match your co-authors' styles, or do you have trouble reconciling them?

K: [laughing] That's also a nice question. Occasionally I've had, ``trouble'' is not the word, but with co-authors there have been discussions about where to put the braces, where to put the spaces, and how to spell identifier names. Most of those things have been pretty trivial, partly because my co-authors have been right around here and we have grown up in the same kind of cultural background. But for instance when Rob Pike and I were working on ``The Practice of Programming'' a couple of years ago, we had pretty intense discussions about trivial things like where to put the spaces. How many spaces do you put? I like to put spaces after things like if and while and for, and Rob does not. I won that part of the battle, but there was some other part of the battle I lost; I don't even remember now what it was. We definitely didn't agree 100%, but we came to a friendly settlement of the whole thing.
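[N.B. (editor's illustration, not theirs): the kind of trivial layout point in dispute; the function is invented.]

    int sign(int x)
    {
        if (x > 0)        /* space after the keyword: the style Kernighan prefers */
            return 1;
        if(x < 0)         /* no space: the alternative */
            return -1;
        return 0;
    }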
The more people you have working on something and the bigger the program, the harder it's going to be, and at some point you have to have agreed-upon standards that everybody sticks to, and mechanized tools like pretty-printers that just enforce doing it by the rules, because otherwise you lose too much time and there's a real chance of making mistakes.

M: You just mentioned pretty-printers; what other programming tools and programming environments do you favor?

K: When I have a choice I still do all my programming in Unix. I use Rob Pike's sam editor; I don't use Emacs. When I can't use sam I use vi, for historical reasons, and I am still quite comfortable with ed [laughing]; I know, that's even before you guys were born. And it's partly a question of history: I knew Bill Joy when he was working on vi.

I don't use fancy debuggers; I use print statements, and I don't use a debugger for anything more than getting a stack trace when the program dies unexpectedly. When I write code on Windows I typically use the Microsoft development environment: they know where all the files are, and how to get all the include files and the like, so I use them, even though in many respects they don't match the way I want to do business. I also use good old-fashioned Unix tools; when I run Windows I typically import something like the MKS toolkit, and I have standard programs that I'm used to for finding things and comparing them and so on.
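[N.B. (editor's sketch of print-statement debugging in the spirit described, not Kernighan's actual code): a trace macro that is compiled away unless the program is built with -DDEBUG.]

    #include <stdio.h>

    #ifdef DEBUG
    #define trace(name, val) fprintf(stderr, "%s:%d: %s = %d\n", \
                                     __FILE__, __LINE__, (name), (val))
    #else
    #define trace(name, val)   /* compiled away in normal builds */
    #endif

    int main(void)
    {
        int n = 42;
        trace("n before", n);   /* prints only in a debugging build */
        n /= 2;
        trace("n after", n);
        return 0;
    }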
M: I'll shift the subject again. When I came to the U.S. I was very surprised to discover that there's very high quality research, including fundamental research --- research which is not necessarily aimed at a product or at making money --- done not only in universities, but also in a few large companies. What can you tell us about research at Lucent, a large company, which used to be part of AT&T, an even bigger company?

K: I'll give you the official company line here, although I think that much of it is still true. Research has been a part of this company, whether it was called ``The Bell System'', AT&T, or Lucent, for a very long time: Bell Labs had its 75th anniversary this year. I think research started as a recognition that there were certain things that people didn't know how to do, but had to figure out how to do if they were to improve whatever product or service they were going to provide. Of course, in ancient times, that was telephone service; 30 or 40 years ago telephone technology started to pick up a significant computer component, and that brought research in computer science here. I think the same sort of thing is true for companies like IBM, which runs very effective research labs as well; that's certainly another company that has a very long tradition of supporting the research environment.

There's the interesting question of ``how does a company justify the money it spends on research''. Lucent at this point has 150,000 employees or so; the research part of it, the part that is you and me, is somewhat less than 1% of that, maybe 1000 to 1500 people. The company's annual revenue was $38 billion in 1999, so we're spending about $400 million annually on research to keep you and me sitting in comfortable offices thinking great thoughts. That actually seems like a pretty reasonable way to invest: a high-risk but potentially high-reward part of your assets. You have to be thinking ``where are we going to be a few years from now?'': what kinds of problems bother us now that we will have to get some kind of solution to; we don't need the solution today, and it would be nice if we had it today, but we know that we're going to need it in the future. Unfortunately it's really hard to figure out how to do these things, sometimes even what the right problems are. I think that the best mechanism anybody has found yet is to take a small amount of money, 1% let's say, and hire a bunch of bright people, and put them into an environment where they are encouraged to talk to each other, to talk to the people in the rest of the company, to find out what kinds of problems the people in the rest of the company have; people in the rest of the company are also encouraged to come and say ``can you help with this problem that we have?'', and the hope is that by this almost random process, and it really is in many ways random...

M: [Interrupting] But not only that; you have research and development in many other companies; at Bell Labs you are also encouraged to publish!

K: ... I think that the question is ``how does this company differ from other companies in the fact that we publish?''. There are several things that you can see there. One is that the scale is much larger; Lucent Bell Labs may still be the biggest industrial research lab anywhere, doing research of the kind that you would find in the universities in the good old days: essentially undirected research, not focused on products immediately. IBM is at least comparable, and Xerox to some degree, and there are other companies like that. One of the issues is that research here, at least in the computer science group and in all of the physical sciences, sits between the hard-core industrial labs, where they're basically doing research that's strongly focused on product, and academia, where they're mostly doing things out of curiosity or thinking further out. A big industrial lab is stretched between those two: it has more of a focus on things that might be practical, but at the same time it has a finger in the academic world, it has connections to the academic world. And it has to do that because, among other things, that's the recruiting mechanism: the reason that you're here rather than at Cisco, let's say, is that Cisco doesn't do research; Cisco buys companies. It's not that Cisco is a bad place, Cisco is a wonderful place in many respects, but it does business differently than Lucent. One other thing that we get by playing in the academic world as well as in industrial development is that we're able to interact with people at universities as equal colleagues, and therefore we can suck in people like you, who are with us for the summer and perhaps will come back permanently. In this way we get a steady influx of good people. But to do that you have to put something back into the system. We have to let you in, we have to show you all the interesting things, and we have to let you write papers, and we have to write papers ourselves, because otherwise people wouldn't believe that we did anything interesting. So we have to be largely participating members of the scientific or academic community as well as contributing to the welfare or good of the company. That's an interesting and unsolved problem: how to do both of those things and keep from getting too far into one or the other camp.

M: You've mentioned Rob Pike; you've authored two books with him. I would like to ask a question about a controversial talk he gave, in which he argues that systems research in computer science is basically dead: http://cm.bell-labs.com/who/rob/utah2000.pdf (mirror at http://linux.usc.edu/~ehovland/utah2000.pdf). What do you think about this statement?

K: In fairness, Rob is almost always right, although I wouldn't say that to his face [laughs]. I only looked at the slides of that talk recently, I didn't hear him give it, but I think that in many respects he's right. His observation is that it's hard to do systems work: the scale of things is sometimes too large for academic environments, and the reward mechanisms in the academic environments may be wrong. As a result, a lot of what happens in real systems work tends to be incremental, performance evaluation rather than synthesizing interesting new combinations. I don't know why that is the case: it may be that it's hard in an academic setting to get proper support, it may be that it takes too long --- Rob's observation is that real systems take five years or more, and that's roughly the duration of graduate studies --- so it's hard to get something that matches the career of a student. I wouldn't say that research in systems is ``dead'', but it's certainly not as alive as it could be.

M: Speaking of academia, I saw that you have taught at least two classes at Princeton. I would like to ask your opinion on computer science education, because I have heard complaints from industry that undergraduates in computer science classes master too many useless theoretical skills and don't know enough about real program development.

K: I've taught four courses at Princeton and Harvard in the last four or five years, at various levels, but that's not enough to qualify me as an ``expert'' in computer science education. Those are two particular schools, and I've taught rather screwball things. I don't think universities should be in the business of teaching things that you should learn at a trade school; I don't think it is the role of a university to teach people how to use, let's say, Visual C++ and its Integrated Development Environment. I think the role of the university is to teach students how to program in a particular flavor of language, one that has, for example, an object-oriented character; to help students understand the issues and trade-offs that go into families of languages, like C, C++, and Java; and how those relate to languages which slice it in a different way, like functional languages. Teaching students skills so that they can step immediately into a Windows development shop and write COM programs is just not right. That's not what universities should be doing; universities should be teaching things which are likely to last, for a lifetime if you're lucky, but at least 5 or 10 or 20 years, and that means principles and ideas. At the same time, they should be illustrating them with the best possible examples taken from current practice.
At Princeton I taught a junior-level course, a combination of software engineering and advanced programming: the students there, at least the seniors in that class, were largely very experienced in the kinds of things that industry probably wants. They were comfortable with Visual C++, they knew how to pick components off the net and glue them together, and they could write Java applications of considerable sophistication. Much of that may have been learned in summer jobs. If industry wants people who have more than a ``useless'' theoretical knowledge [laughs], what it should be doing is making sure it gets these bright kids from school and gives them interesting summer jobs that round out the theoretical ideas and the general insights with the specifics of a particular application. People pick that stuff up remarkably fast, and if they do interesting things in summer jobs they carry that back into their academic careers. I was pretty impressed by how much the students knew, stuff they had not all learned in class.

M: Speaking of students, what advice would you give to a computer science student who wants to pursue a research path? Maybe you see some areas as more rewarding than others, and maybe some areas are not interesting anymore?

K: Well, don't take my advice on careers [laughs]. Unfortunately I don't think that there is any good advice. The interesting, sorry, I shouldn't be saying ``interesting'' --- the areas that are difficult are only two: one is that it's too hard to write programs that work, and the other is that it's too hard to use computers. So if you want things to work on, these are two that you could try. Of course, those are fairly general [laughs]; there are a lot of special cases that you could play with. If you make any progress at all, on any aspect, then you have an opportunity to go on and pursue the purely academic side of it, or alternatively you may go out and try to make your fortune in a start-up. And at this point it looks like a lot of people would rather make their fortune in a start-up than spend 5 or 6 years getting a Ph.D. Maybe you're just misguided [laughs].... I think unfortunately the best advice you can give somebody is ``do what you think is interesting, do something that you think is fun and worthwhile, because otherwise you won't do it well anyway''. But that's not any real help.

M: Maybe you can help by being more concrete: can you recommend some books, computer science books or otherwise, which you think have had a big influence on you?

K: The only computer science book I have read more than once, that I actually pick up every few years and read parts of again, is The Mythical Man-Month by Fred Brooks, a great book. Partly it's very well written, and partly the advice in it, even after more than 25 years, is still highly relevant. There are of course details that are different, some things we approach differently because we have more mechanization and more computer horsepower, but there's an enormous amount of good advice in it, so I recommend it highly. That's the only computer science book I can think of that you read for some combination of pleasure and insight.

There are other books that I reread that are relevant in computing. Books on how to write (English, in my particular case), like ``The Elements of Style'' by Strunk and White. I go back and reread that every few years as well, because I think the ability to communicate is probably just as important for most people as the ability to sit down and write code. The ability to convey what it is that you're doing is very important.

There's also a great book, How to Lie with Statistics, which you might find useful in your own research [laughter].

M: I'll change gears again. Unix and C were created at AT&T and were released under a license which at that time was virtually an open-source license, because AT&T had to do that: being a monopoly, it had an agreement with the government, as far as I understand, not to make money from computers. A lot of people credit this very fact, this liberal license, with the popularity and influence that both Unix and C have had. Recently Lucent released Plan 9 under an open-source license. What do you think about this ``new'' phenomenon of open source?

K: I think it's actually a good thing for the most part. The original Unix license was, as you say, largely done the way it was because AT&T was not permitted to be in any business except the telephone business, so it couldn't make serious money on any kind of software. Because of that they were forced into a very sensible decision, which was to give Unix away essentially for free to universities. They did sell it commercially for what amounted to a nuisance fee, but for universities they gave it away, and as a result an entire generation of students and their faculty grew up thinking that Unix was the way you did computing. Unix certainly was much more productive than the commercial operating systems which were available at the time, and because the source code came along with it, it was easy to see what was going on, and it was easy to make improvements. It built a community of people who were interested in it, motivated by the same things, who were all able to contribute and in that way work themselves up. I think that the current open-source movement has much of the same character. Many of the tools developed in open source are based on Unix models. Linux is the obvious thing, being, at least on the outside, based on Unix; many of the things that come from the Free Software Foundation are reimplementations of standard tools and languages from Unix. There are of course other projects, arising because of some weird commercial pressure, like Mozilla, the Netscape code, which is now in the public domain and to which people are contributing as well. I think that the open-source movement is in general a good thing. I am not sure that it will ever replace tailored, professional, rock-solid commercial products sold for profit. But what it might do in a lot of cases, and I think it genuinely does do in some things like C compilers, is to provide a reference implementation and a standard that's pretty good and that other implementations have to roughly match, or why would anybody pay for them? I think that in that sense it's a useful thing. As for Plan 9, I think that's too late, unfortunately. I think Plan 9 was a great idea and it should have been released under an open-source license when it was first done, eight years ago, but our legal guardians would not permit it. I think that they made a grievous mistake. The current open-source license is definitely worth having, but it's not clear whether Plan 9, at least as a general-purpose operating system, will have much effect except in a relatively small niche. It has many things going for it which make it valuable in different areas, particularly where you need a small and highly portable operating system, but is it going to take over from Linux? Probably not.
M: I am getting ready to end on a lighter note, but first I'll ask another deeper question. Extrapolating from the evolution of computer science so far, what other great advances do you expect in the near, and, I don't dare to ask, far future?

K: [laughs] If I could predict the future then I would invest more wisely and I wouldn't have to do these low-paid interviews [N.B. the interview was done for free]. Geez, you know, unfortunately I am actually so bad at predicting things.... I am going to guess that in some sense the situation in computing will be almost the same: we make a lot of progress, we are able to undertake bigger projects, we can build things which are much more interesting and sophisticated than what we could do 10 years ago. If you look at the kinds of things running on the PC in front of us now, they are enormously more powerful and flexible than they were 10 years ago. But the amount of messy, intricate, awful code that doesn't work very well and that's underneath all of that has also increased enormously. In some sense I guess we'll continue to make progress, but it'll always be kind of grimy and not really done yet. Because people always take on more than they can reasonably handle, they're always overreaching, and they seem never to go back and clean up the stuff that they did before.

The other thing that I actually worry about is that computers are pervasive; they're just everywhere, in everything you do. You cannot turn around without being involved with something that depends critically on computers, and more and more of those are things that actually matter to us. It doesn't matter if the Sound Recorder on Windows works or not....

M: It matters now! [N.B. this interview was recorded on a PC with sound-recording software]

K: .... if we lost this [interview], the hell with it. But for example, when you fly back to Pittsburgh, you really, really want the computers that control the avionics in the Airbus A320 to work properly and not make any mistakes. That's just one of the things that depend critically on our ability to write software that works and build hardware that supports it, and I don't think we know how to do that well yet. We're making progress, it's definitely better, the combination of languages and techniques like verification helps, but we still have the same problem: as we understand better how to do the small things, we take on bigger and more complicated things, and so in a sense we're always up to our armpits in alligators.

M: The last question I have on my list is about your hobbies.

K: [laughing] Who's got time for hobbies? If I look at what my hobbies are at this point, they amount to reading. When I am not fooling around or playing with computers or doing things that are related to work, I find myself reading. Usually history books; I don't know why, it's kind of weird, but I like to read history books. Over my lifetime, over the last 20 or 30 years, I've gone through phases where I've gotten deep into something for 3, 4, 5 years. I went through a phase in which I tried to learn Japanese, for example; I can tell you, it takes longer than 3 years to learn Japanese! And so that was a failure. When I was a graduate student I spent roughly five years doing karate, and I got to a point where I could sort of survive, but I dropped it; it's no longer part of my life. And I went through a phase of being very interested in investments, and that didn't change my life either, so obviously I wasn't very good at it. So the thing I mostly do at this point is read a lot.

M: Thank you very much!
BIN  Zim/Interview/An_Interview_with_Brian_Kernighan/c-book.jpg.gif  Normal file
BIN  Zim/Interview/An_Interview_with_Brian_Kernighan/kernighan.jpg  Normal file
7  Zim/Interview/DMR.txt  Normal file
@@ -0,0 +1,7 @@
Content-Type: text/x-zim-wiki
Wiki-Format: zim 0.4
Creation-Date: 2011-06-03T10:46:02+08:00

====== DMR ======
Created Friday 03 June 2011
72  Zim/Interview/DMR/Memo.txt  Normal file
@@ -0,0 +1,72 @@
Content-Type: text/x-zim-wiki
Wiki-Format: zim 0.4
Creation-Date: 2011-06-03T10:46:15+08:00

====== Memo ======
Created Friday 03 June 2011

**Proposing a Unix Portability Experiment**
Below is a memo that Steve Johnson and I gave to Elliot Pinson, our boss at the time, proposing the idea of buying a new machine to test the idea of porting Unix to a new architecture. It is, unfortunately, undated. Internal evidence from the document ("around March, 1976") suggests that it must date to ca. the beginning of 1976. The modified-date of the file as I found it is in 1978, but the file was certainly copied or fiddled, because the paper on the outcome of the experiment, "Portability of C Programs and the UNIX System", BSTJ 57 6, July-August 1978, confirms that the proposed machine (an Interdata 8/32) arrived in April 1977.

Johnson's PCC compiler, which would be used for the Interdata project and would later be used for the VAX (Unix 32V) here and at Berkeley (BSD), and then also by important startups like MIPS, SGI, and Sun, was evidently in its own late formative stage at the time of writing.

This memo, of course, wasn't a surprise to Pinson. He'd been quite supportive of the idea, and the memo follows a familiar style: researchers want to spend money and time on something, management wants reassurance that the researchers have a moderately coherent idea. Writing this was our equivalent of writing a DARPA or NSF grant proposal, but one that was already a lock.

To explain probably-unfamiliar references: ALTRAN was a language and system for doing symbolic algebra, primarily on polynomials and rational forms; it was mostly the brainchild of W. Stanley Brown. The whole thing was written in hyper-portable Fortran. #3ESS was a not-widely-deployed telephone Central Office for small communities; it used a unique processor (the AP-3). At the time it looked like fun and value for the Bell System for us to move Unix to it, but this didn't pan out.
Unix Portability

S. C. Johnson
D. M. Ritchie

What We Want to Do

We propose a project with three major goals:
1. to refine and extend the C language to make most C programs portable to a wide variety of machines, mechanically identifying non-portable constructions;
2. to write a compiler for C which can be changed without grave difficulty to generate code for a variety of machines;
3. to revise or recode a substantial portion of the Unix system in portable C, detecting and isolating machine dependencies, and demonstrate its portability by moving it to another machine.
From pursuing each goal we hope to attain a corresponding benefit:
1. improved understanding of the proper design of languages which, like C, operate on a level close to that of real machines but which can be made largely machine-independent;
2. a C compiler which can be adapted to other machines (independently of Unix), and which puts in practice some recent developments in the theory of code generation;
3. a working, if perhaps crude and incomplete, implementation of Unix on at least one other machine, with the hope that other implementations will be fairly straightforward.
What We Cannot Do

The common theme here is portability on a bold, apparently unprecedented, scale: we are attempting to make `portable' a multi-access, multi-programming operating system complete with enough utility programs to make it useful as a production tool. It is clear that the degree of portability promised cannot approach that of ALTRAN, for example, which can be brought up with a fortnight of effort by someone skilled in local conditions but ignorant of ALTRAN itself. We do not plan that the C language be bootstrapped by means of a simple interpreter of an intermediate language; instead an acceptably efficient code generator must be written. The compiler will indeed be designed carefully so as to make changes easy, but for each new machine it will inevitably demand considerable skill even to decide on data representations and run-time conventions, let alone code sequences to be produced.

Likewise, although we hope to isolate the machine-dependent portions of the operating system into a set of primitive routines, implementation of these primitives will involve deep knowledge of the most recondite aspects of the target machine, including the details of I/O operations and memory protection and relocation.

Next, the sheer bulk of code potentially involved is quite great, on the order of 100,000 lines, counting standard Unix utilities written in C but not subroutines or assembly-language routines and interfaces. Even if many of these utilities are dispensable at first, even if they are mostly portable anyway, even if we are able to discover non-portable constructs mechanically and unerringly, there will still be plenty of work in transporting them.

In view of the intrinsic difficulties of our project, we feel well justified in rejecting a number of sub-goals which might seem otherwise defensible. Thus, we cannot hope to make a portable Unix system compatible with software, file formats, or inadequate character sets that may already exist on the machine to which it is moved; to promise to do so would impossibly complicate the project and, in fact, might destroy the usefulness of the result. If Unix is to be installed on a machine, its way of doing business must be accepted as the right way; afterwards, perhaps, other software can be made to work.
Where We Are

Each of the goals we mentioned initially has some characteristic problems associated with it. Since we have devoted careful attention only to the portability of C programs, and have uncovered several difficulties therein, we will discuss that aspect of the project.

Most of the C difficulties stem from the twin assumptions that pointers and integers require the same space in storage, and that pointers to an object of a given type (for example integers) are usable as pointers to a more finely-divided type (for example characters). The first assumption is stated explicitly in the C definition, and happens to be realistic on the only machines for which C has been completely implemented (PDP-11, H6000, S/370). It is used when two structures, which differ by substitution of an integer for a pointer, are assumed to be congruent; when the number `0' is used as an argument to a function expecting a pointer, to indicate a null value; and when a function delivering a pointer remains undeclared (and thus implicitly integer-valued). This assumption fails on some potentially interesting machines, such as the ESS #3 CC (a.k.a. AP-3), which has 16-bit integers and 20-bit pointers.
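[Editorial sketch, not part of the memo: the constructions described above, written in the pre-ANSI C of the period. The names alloc and g are invented; a modern compiler rejects the implicit declarations, which is exactly the protection the memo is asking for.]

    /* no declaration of alloc() is visible, so the compiler
       assumes it returns int */
    f()
    {
        char *p;
        p = alloc(100);   /* pointer result squeezed through an int:
                             breaks where pointers (20 bits) are wider
                             than integers (16 bits) */
        g(0);             /* the constant 0 passed where g() expects a
                             pointer: safe only if ints and pointers
                             have the same size and representation */
    }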
The second assumption, that various sorts of pointers are compatible, is visible most dramatically in the current I/O interfaces to the operating system. The `read' call, for example, is equally happy to receive a character pointer as a destination for bytes as it is an integer pointer through which to store integers. This assumption too fails on many machines, including most word-oriented 16-bit machines as well as the Sigma 5.
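[Again an editorial sketch, with modern spellings for the declarations; period code had none.]

    #include <unistd.h>

    char cbuf[512];
    int  ibuf[256];

    void fill(int fd)
    {
        read(fd, cbuf, 512);   /* destination is a character array */
        read(fd, ibuf, 512);   /* destination is an integer array: assumes
                                  char* and int* are interchangeable, which
                                  fails on word-oriented machines */
    }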
If C, like PL/1 and Algol 68, required declarations of functions and of their arguments to be visible at the time of their use, the effects of these assumptions would be largely mitigated, because the compiler could insert the appropriate conversions. It remains to be seen whether the burden of inserting such declarations is adequately compensated for. In any event, we have a useful tool (the `lint' program) for discovering inappropriate associations between integers and pointers; it can be modified, if need be, to produce the required declarations automatically.

Work on the portable C compiler is well under way; it uses the YACC-based `front-end' which has existed for some time. Examination of the operating system proper has not begun.
What We Want

We would like to obtain a new machine, and have easy access to it from the Research Unix machine, around March, 1976; by that time the new compiler should be working well. Two factors will govern the choice of this machine. It should not be so similar to the PDP-11 that the exercise becomes trivial, nor should it be so peculiar that the result is unnecessarily complicated or inefficient. Moreover, if the machine is attractive in its own right, then even the initial implementation of portable Unix may find an immediate market in the Bell System and elsewhere.

Copyright © 1998 Lucent Technologies Inc. All rights reserved.
181  Zim/Interview/DMR/Writings_from_the_Past--Dennis_M._Ritchie.txt  Normal file
@@ -0,0 +1,181 @@
Content-Type: text/x-zim-wiki
Wiki-Format: zim 0.4
Creation-Date: 2011-06-03T09:31:20+08:00

====== Writings from the Past--Dennis M. Ritchie ======
Created Friday 03 June 2011
http://cm.bell-labs.com/cm/cs/who/dmr/notes.html

===== Writings from the Past =====
**Machine-readable** versions of early Unix material are hard to come by, even for us. "Backup" in those days (1969 through the early 70s) consisted of punched cards, paper tapes, or uploading to a Honeywell machine. We no longer have those cards, tapes, or the Honeywell.

When we got a **PDP-11** around 1971, we did get **DECtape**, and did save some material, though not enough. A few years ago Keith Bostic and Paul Vixie resurrected a PDP-11 DECtape drive and offered to read any old tapes we might have around, and I sent several to them. These notes are among the small treasures discovered there.

Two files named "**notes1**" and "**notes2**" were on the tape labelled "DMR", and their date, if I correctly interpret the time representation on the tape, is 15 March, 1972.

I reproduce them below. I have no memory of why I wrote them, but they look very much like something to have in front of me for a talk somewhere, because of the references to slides. From the wording at the end ("the public, i.e. other Labs users"), I gather that it was intended to be internal to Bell Labs. HTML markup and the corrections and annotations in [] were added in September 1997, but otherwise it's original.

**Dennis**
===== notes1 =====

UNIX is **a general-purpose timesharing operating system** for the Digital Equipment Corp PDP-11.

It is the only such system known to me; in any case it is probably the first. DEC, in particular, shows no signs of producing a **multi-user** system.

==== Some history and credits. ====

UNIX was written by **K. Thompson**. I wrote much of the system software; Ken most of the rest; other contributors have been Joe Ossanna, Doug McIlroy, and Bob Morris.

The first UNIX was on a **PDP-7**. Most features were present, some in rudimentary form.

The UNIX-11 system was largely written in **Jan.-Mar., 1971**; since the[n] changes are generally refinements.

The charter for the project, and the reason the machine was obtained, **was to develop a document editing and formatting system**. The original notion was to **use UNIX as a development tool only, and have the editing system run stand-alone**. [This is true as it applies to justifying the purchase of the PDP-11. The earlier work on the PDP-7 just happened in the course of doing research.]

It turned out, however, that it was quite practical to have the editing system run under UNIX, and this is [how it] operates.

The editing system is now being used by the Patent Division at Murray Hill to **prepare patent applications**. I understand that the bulk of the applications are being done by UNIX. UNIX is also being tried out by two typists in the MH typing pool.

At the start I stated that UNIX was a general-purpose time-sharing system. I imagine the concept is familiar, but I want to bring out a few points.

**First, it is general purpose.** This means it may, or may not, be suitable **as a basis for various special-purpose applications**. The two things that come to mind are **management of large data bases** and applications requiring very **rapid real time response**.
For several reasons, not all defensible, very large files of information are not very well handled by the file system proper. I will return to this point.

In the real-time area, the system has no direct hooks to allow a user-written program to respond very rapidly to an external event. **"Very rapidly", here, means in less than about half a second**, which is the approximate time to **swap programs off the disk**. There are thoughts about how to arrange this, but it has not been done.

Notice, however, that the system does in fact make rapid responses to events, in the sense that it is able to pick up characters from typewriter terminals even when they come only a few milliseconds apart. Thus if the "real-time" requirement is really that of **collecting characters** from a terminal or other machine, that ability is already there; what may be lacking is the possibility of **scheduling a non-system program** which wants to act on those characters within a very short time.

So far I have been talking about the disadvantages of the general-purpose system. The advantage lies in the ease with which program development can be done. With any reasonable number of users, response to users' commands is quite fast, and the cycle of edit-translate-execute-debug is speed-limited by **thinking time, not compute time**. For example, it takes about **50** seconds to assemble and install a new UNIX system.

The time-sharing aspect of the system is also vital to a program development effort, since it means that several people can be using the machine at once.

Likewise, when the system is adapted to deal with a particular application, the fact that **multi-programming** is built in from the start means that the most subtle problems, which come up when several things must go on at approximately the same time, have already been solved.

Given that UNIX is **excellent for program development**, but may have limitations as a base upon which to build a more specialized system, the question arises as to the advisability of writing one's own system using UNIX. I can only say that of the several projects using UNIX, all have in fact decided to use it not only for development but to support their application directly. Often some modifications to the system have proved necessary, however. Typically, though, these relate to support of some device the standard UNIX does not provide for, and to rearranging the management of core memory.
==== HARDWARE ====

UNIX is running on at least five PDP-11s, no two with the same complement of hardware. The slide shows the minimal complement possible.

The basic requirements above a PDP-11 processor are:

12K core
some kind of disk
a clock
an EAE (extended arithmetic element, for multiply/divide)
some sort of tape, to provide for loading the system software and saving the disk

Above the minimum, it is of course desirable to have more core, lots of disk, communications interfaces, a paper tape reader, and a ROM containing a bootload program.
A summary of the devices which have been attached to actual UNIX systems includes:

DC-11 communications interfaces attached by DATAPHONE to 10, 15, or 30 cps ASCII terminals (not DM-11 as yet)
RF fixed-head disk (256K words)
RK moving-head disk (1.2M words)
RP moving-head disk (2314-style, 10M words)
paper tape reader/punch
201 DATAPHONE interface
ACU on DATAPHONE
DECtape
Magtape (9-track)
card reader
line printer
==== SOFTWARE ====

There is a good deal of software that goes along with UNIX. It should be pointed out that it was all written using UNIX; none of it comes from DEC or elsewhere.

It should also be pointed out that almost all of it is being worked on in one way or another. That is, while everything I will list is usable, there are a number of things which we regard as desirable that are not complete.

The major pieces of UNIX software are:
assembler
link editor
text editor
FORTRAN compiler
B compiler
symbolic debugger
text formatting program
M6 macro processor
TMGL compiler-compiler (the last two contributed by Doug McIlroy)
command line interpreter
many utilities, mostly to deal in one way or another with the file system
==== SYSTEM ====
|
||||
|
||||
The system proper can be regarded as falling into three parts:** the file system, the process control system, and the rest**.
|
||||
|
||||
I will only talk about the file system in any detail at all.
|
||||
|
||||
Files in UNIX are arranged in **a hierarchical, tree shaped structure.** There are two types of object: **files**, and **directories**. A directory is actually no more than a file, but its contents are controlled by the **system**, and the contents are names of other files. (A directory is sometimes called a catalog in other systems.)
|
||||
|
||||
Thus we have the type of arrangement shown in the slide: there is a **root** directory, which is the "base" of the tree (as usual, the tree is upside-down) and which contains files and directories.
Each of the directories under the root also can contain files and directories, and so on.
In UNIX, files are named by giving a sequence of directories, separated by slashes, and ending in a file or directory (for example: ...)
The name of the root is "/", and so it begins the sequence.
(example...)
Every user always has a **current directory**, which belongs to him. Files can also be named with respect to the current directory, when the name does not begin with a "/". (examples: ...)
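For instance (these particular names are invented for illustration, not taken from the talk):

    /usr/ken/prog.b        a full name: the root "/", then usr, then ken, then the file prog.b
    prog.b                 a relative name, found in the current directory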
It is possible for the same file to appear in several different directories, under possibly different names. This feature is called "**linking**" (example).
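In modern C terms, linking looks roughly like this (a sketch; both path names are invented):

    #include <unistd.h>

    int main(void)
    {
        /* After this call, both names refer to the same underlying file. */
        link("/usr/ken/data", "/usr/dmr/data");
        return 0;
    }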
It is also possible for the directory hierarchy to **be split** across several devices. Thus the system can store a directory, and all [files] and directories lower than it in the hierarchy, on a device other than the one on which the root is stored.
In particular, in our own version of the system, there is a directory "**/usr**" which contains **all users' directories**, and which is stored on a relatively large but slow moving-head disk, while the other files are on the fast but small fixed-head disk.
One of the most interesting notions in the file system is the **special file**.
Certain files do not refer to disk files at all, but to **I/O devices**. By convention, such special files reside in a **particular directory**, although this is not necessary. When a special file is read or written, **the device it refers to is activated**. For example, all the communications interfaces attached to typewriters have special files associated with them. Thus anyone, provided he has permission, can send a message to another user simply by **writing information onto his typewriter's special file**. There are special files, for example, to refer to the paper tape reader and punch, to the 201 DATAPHONE, the console typewriter, and whatever other devices may be on the system. An effort is made to **make these special files behave exactly the same way that ordinary disk files behave**. This means that programs generally do not need to know whether they are reading or writing on some device or on a disk file.
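A hedged sketch of the idea in modern C terms (the device name and the message are invented for illustration):

    #include <fcntl.h>
    #include <unistd.h>

    int main(void)
    {
        /* Open another user's typewriter via its special file and write to
           it; this succeeds only if the file's permissions allow it. */
        int fd = open("/dev/tty3", O_WRONLY);
        if (fd >= 0) {
            write(fd, "hello from across the hall\n", 27);
            close(fd);
        }
        return 0;
    }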
The system calls to do I/O are designed to be **very simple** to use as well as efficient. There is no notion corresponding to GEFRC on the Honeywell machines or to "access methods" in OS [/360 from IBM], because the direct use of system entries is so straightforward.
Files are **uniformly regarded as consisting of a stream of bytes**; the system makes **no assumptions as to their contents**. Thus the **structure of files** is controlled solely by the **programs** which read and write them. A file of ASCII text, for example, consists simply of a stream of characters **delimited by new-line characters**. The notion of physical record is fairly well submerged.
For example, the system entry to read a file has only **three arguments**: the file which is being read; the location where the information is to be placed; and the number of bytes desired. Likewise the write call need only specify the file under consideration, the location of the information, and the number of characters to write. The system takes care of **splitting the read or written information into physical blocks as required**.
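In modern POSIX terms, the interface described above looks roughly like this (a sketch, not the original code; the file name is invented):

    #include <fcntl.h>
    #include <unistd.h>

    int main(void)
    {
        char buf[512];
        int n;
        int fd = open("/usr/ken/data", O_RDONLY);

        /* read(file, where, how-many): copy the file to standard output;
           the system does all the splitting into physical blocks. */
        while ((n = read(fd, buf, sizeof buf)) > 0)
            write(1, buf, n);
        close(fd);
        return 0;
    }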
The I/O calls are also apparently **synchronous**; that is, for example, when something is written, so far as the user is concerned, the writing has already been done. Actually the system itself contains **buffers** which contain the information, so that the physical writing may actually be delayed.
There is no distinction between **"random" and sequential I/O**. The read and write calls are sequential in that, for example, if you read 100 bytes from a file, the next read call will return bytes starting just after the last one read. It is however possible to move the read pointer around (by means of a "seek" call) so as to **read the file in any order**.
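A hedged sketch of the same point in modern C terms (the file name is invented): reads are sequential by default, but the read pointer can be repositioned first.

    #include <fcntl.h>
    #include <unistd.h>

    int main(void)
    {
        char buf[100];
        int fd = open("/usr/ken/data", O_RDONLY);

        read(fd, buf, 100);            /* bytes 0..99 */
        read(fd, buf, 100);            /* bytes 100..199: sequential */
        lseek(fd, 5000L, SEEK_SET);    /* move the read pointer, then... */
        read(fd, buf, 100);            /* ...read bytes 5000..5099 */
        close(fd);
        return 0;
    }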
I should say that it is not always desirable to ignore the fact of **physical record sizes**. A program which reads **one character** at a time from a file is clearly at a disadvantage compared to one which reads many, if only because of system overhead. Thus __I/O bound programs are well-advised to read and write in multiples of the physical record size__ (which happens to be uniformly 512 bytes). But it is efficiency, not a logical requirement, which dictates this.
==== PROBLEMS ====
I mentioned earlier that UNIX was not especially suited to applications involving vast quantities of data. The reason is this: files are limited in size to **64K** bytes. The reason for this is not particularly defensible, but it has to do with the fact that the PDP-11 **word size is 16 bits**.
There are a couple of ways around this problem. One of them is simply to split one large logical file into several smaller actual files. This approach works for a while. The limitation here comes from the fact that **directories are searched in a linear fashion**. Thus if there are a vast number of files, it can become quite time-consuming to search directories to find the files they contain. We have not noticed this to be a problem so far; it is only a worry.
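A hedged sketch of the first workaround in modern C terms (the piece names and the helper are invented for illustration): a large logical file is stored as 64K-byte pieces named data0, data1, and so on.

    #include <stdio.h>
    #include <fcntl.h>
    #include <unistd.h>

    /* Open the piece holding logical byte `off` and seek to the right spot. */
    int open_piece(long off)
    {
        char name[32];
        sprintf(name, "data%ld", off / 65536);    /* which 64K piece */
        int fd = open(name, O_RDONLY);
        if (fd >= 0)
            lseek(fd, off % 65536, SEEK_SET);     /* offset within it */
        return fd;
    }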
Another way around the small file size is to use a disk as a special file. For various reasons, when an entire disk drive is accessed as a special file, the size limitation does not occur. Thus one can set up a program which manages its own data -- in effect, its own special-purpose file system -- and expect reasonable results.
This again bears on the question of general- versus special-purpose systems: it probably is more efficient anyway to do your own data management, provided the extra labor is worth the cost.
==== PROCESS CONTROL ====
As I said, the second part of UNIX is that part concerned with process control. __A process in UNIX is simply the execution of a program__. Each user has at least one process working on his behalf: its task is to read his typewriter and interpret what he types as commands to the system to do something. The program associated with this process is called the __Shell__, and it has many valuable features, including the **redirection of I/O**, so that you can execute programs which ordinarily write, for example, on the typewriter, and arrange that their output go on a file.
I will not go into any details, except to say that either by use of the Shell, or from within a program, it is possible to create an asynchronously running process executing any program designated.
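A hedged sketch of that facility in modern POSIX terms (the program being run is an arbitrary example):

    #include <unistd.h>

    int main(void)
    {
        if (fork() == 0) {
            /* Child: run the designated program. */
            execl("/bin/ls", "ls", (char *)0);
            _exit(1);    /* reached only if execl fails */
        }
        /* Parent continues at once; the child runs asynchronously. */
        return 0;
    }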
==== SUMMARY ====
If you are interested in using UNIX, there are a number of points of which you should be aware.
First, having to do with the PDP-11 hardware:
the PDP-11, although probably more powerful than most people realize, is not a large machine: a PDP-11 can only accommodate 28K 16-bit words of core.
Moreover, the 11-20 has no hardware protection features: any user can at any time crash the system by executing a program with any of an infinite variety of bugs. This fact is probably most important during program development.
The PDP-11/45 essentially solves both of these problems in a very cost-effective way -- it is hardly more expensive than an 11/20 when the total system cost is considered. It has hardware segmentation, and 256K of core can be attached. Since we will be one of the first to get an 11/45, there will definitely be a UNIX on it very soon after it arrives. (However, the date is still uncertain.)
Perhaps more important is the fact that **UNIX is essentially a two-man operation at present**. Anyone who contemplates a UNIX installation should have available some **fairly sophisticated programming talent** if any modifications are planned, as they almost certainly will be. The amount of time that we can spend working on behalf of, or even advising, new UNIX users is limited. Documentation exists, but never seems to be complete.
There have been rumblings from certain departments about taking over the maintenance of UNIX for the public (i.e., other Labs users) but I cannot promise anything.
53
Zim/Interview/Guru_of_the_Unix_gurus.txt
Normal file
@@ -0,0 +1,53 @@
Content-Type: text/x-zim-wiki
Wiki-Format: zim 0.4
Creation-Date: 2011-06-06T14:52:01+08:00
====== Guru of the Unix gurus ======
Created Monday 06 June 2011
http://www.salon.com/technology/feature/2000/09/01/rich_stevens/index.html
Guru of the Unix gurus
A year after his death, the programming community still treasures the influence of Rich Stevens.
By Rachel Chalmers
When Andrew Hume presented the Usenix Lifetime Achievement Award in San Diego in June, he managed to say exactly two words -- "Richard Stevens" -- before a standing ovation drowned him out. "I sat next to Richard's family at the presentation," says Tom Christiansen, a well-known figure in the Perl programming community who had known Stevens on and off for years. "It was stunning. I don't know if his family did, but I sure noticed a lot of the audience in tears."
"Usenix," (a word coined to get around trademark restrictions on the word "Unix") is the Advanced Computing Systems Association. W. Richard Stevens is the author of "TCP/IP Illustrated" and "Unix Network Programming," each of which runs to three volumes, and "Advanced Programming in the Unix Environment." Their influence among Unix users is hard to overstate. Thousands of programmers all over the world consider Stevens a guru and his works essential to their jobs.
|
||||
|
||||
"It blew my mind," says his sister, Claire Stevens. "I knew he wrote those books, but it never made a dent. I had absolutely no idea that all these people knew and were touched by him." Claire and Richard's wife, Sally, accepted the award on Stevens' behalf. Stevens died on Sept. 1, 1999. He was 48 years old.
His death hit the close-knit Unix community hard. Fiercely intelligent and deeply private, Stevens set an example for everyone in the Unix world. What he didn't know, he determined to find out; what he did know, he strove to pass on to anyone who was interested. A year after his death, memories of one of the Unix community's most beloved experts are still fresh and vital.
Christiansen's recollection is typical. The two were casual acquaintances from the academic conference circuit. "I remember I was doodling around on the piano, and Richard came over and said, 'I heard you doing that at some other conference and it inspired me to take up the piano again,'" Christiansen says. "On subsequent meetings he would tell me all about his progress. As with everything he talked about that he really loved, his eyes just kind of sparkled."
With Christiansen, Stevens talked about music. With cryptographer Greg Rose, a fellow pilot, he talked about flying. With Dave Hanson, who sat on the committee to assess his doctoral thesis, Stevens talked about yet another shared passion, skiing. To everyone who knew him, it seemed he cared about the things that mattered most to them.
"He was a very good listener and he knew something about every subject," says Claire. "He could always contribute something, or at least sound intelligent."
Yet Stevens was also an extraordinarily private man. Christiansen, Rose and Hanson all knew him for years, yet none felt that they knew him well. "I wouldn't say he was complex, but because of his intelligence he could come across that way," Claire admits.
He brought the full force of his intellect to bear on his work, and it showed. Stevens' books are regarded as models of the genre. "I remember he shocked me, because I ran into him at some conference and he said, 'I wish I'd brought the 'Perl Cookbook' so I could have had you autograph it for me,'" says Christiansen. "I don't consider my books to be in anywhere near the same league of serious, hard work as his. That he would say to me 'I should have got your autograph' was like Dennis [Ritchie, one of the original authors of Unix] saying it."
"He gave extremely cogent explanations of what's going on," says Rose. "His books are so much more readable than most computer books. A lot of authors look at the documentation and rewrite it with little or no interpretation. He took the time to understand."
His perseverance was an artifact of an apparently ferocious curiosity. "When I hit something that I don't understand, I take a detour and learn something new," said Stevens in an interview two years before his death. "This often makes my books late by a few months, but I think accuracy and completeness are essential."
It may have been Stevens' sense of himself as an outsider that inspired this level of dedication. He and his brother and sister were copper-mining brats, born in Africa to a metallurgical engineer; they divided their childhood among Northern Rhodesia (now Zambia), Utah, New Mexico, Washington and South Africa. Stevens didn't start out in computer science at all, but in aerospace. It just happened that he graduated in 1973, when Boeing was laying off thousands of aerospace engineers. Programming, like piano, was a late acquisition that took.
"I really believe that my background is fundamental to the success of 'Unix Network Programming' and my other books," he said. "That is, I was not one of the developers at Berkeley or AT&T, so the writing of UNP was not a 'memory dump.' Everything that is in the book I had to dig out of somewhere and understand myself."
In fact, Stevens belonged to the generation of Unix geeks who pored over not-exactly-legal copies of John Lions' legendary commentary on the Unix source code. Lions firmly believed that no one could understand the theory of computer science without concrete examples, ideally embodied in C programs. Stevens seems to have agreed.
"Certainly his books do that. He tends to walk you through the code," says Hanson, who led the graduate seminars at the University of Arizona where Stevens discovered the Lions books. "It's not like you can read this stuff while you're sitting in front of the television. It's a particular style of learning about software that requires a very heavy investment upfront. But it pays off."
It's not as easy, or as obvious, as it sounds. Publishing code legibly is hard work, and calls for stubborn authors who insist on the best tools for the job. Stevens greatly admired and strove to emulate Donald Knuth, who wrote "The Art of Computer Programming," and Brian Kernighan, co-author of "The C Programming Language," whose books are as beautifully laid out as they are brilliantly written.
For his own work, he insisted on a text-formatting tool even more venerable and revered than Knuth's TeX. "Rich had told me that he gets to send 'troff,'" says Christiansen in awed tones, referring to a fairly archaic layout program. "He still used all the cool old tools, and he prepared camera-ready copy for the publisher. It's very, very rare that authors aren't forced to use Microsoft Word or unadorned SGML." Controlling his own means of production allowed Stevens to explain difficult concepts visually as well as verbally -- hence the apparently perverse theme of "TCP/IP Illustrated." Visualizing a protocol? It sounds crazy, but it works.
His books are so good that they have come to symbolize intelligence. In "Wayne's World II," Garth's girlfriend carries a copy of "Unix Network Programming." Stevens discovered this when he took his 13-year-old son to see the film. His son grabbed his arm and said, "Dad, that's your book!"
"I couldn't believe it," he told programmer Trent Hein. "My book was used to define the ultimate geek, and suddenly my son thinks I'm really cool."
His son was right.
154
Zim/Interview/Interview_with_Bill_Joy.txt
Normal file
@@ -0,0 +1,154 @@
Content-Type: text/x-zim-wiki
Wiki-Format: zim 0.4
Creation-Date: 2011-03-27T21:03:52+08:00
====== Interview with Bill Joy ======
Created Sunday 27 March 2011
The following interview is taken from the August 1984 issue of Unix Review magazine (which briefly changed its name to Performance Computing, but then dropped out of print and went to an online-only format at www.unixreview.com). Permission has not been given to copy this, so you should only use it for your own personal or scholastic purposes. Do not distribute it without obtaining permission from the publisher of Unix Review.
Actually, I tried to get permission to distribute this with elvis, but the publisher wasn't sure who owned the copyright. Since they weren't sure it was theirs, they didn't feel they could give me permission. I've decided to make this available on my web site until somebody steps forward to claim it as theirs.
Interview with Bill Joy
By Jim Joyce
Bill Joy is one of those rare people who can carry on a rapid-fire technical conversation while coding at the keyboard. His seemingly inexhaustible energy has produced the C shell command interpreter, the vi screen editor and the Berkeley paging kernel, among other accomplishments. UNIX REVIEW sent Jim Joyce to Sun Microsystems, where Joy is Vice President in charge of Research and Development, to capture some of this energy.
REVIEW: How did vi come about?
JOY: It's an interesting story. I came to Berkeley in '75 as a theory student and got involved with Mike Harrison working on general context-free parsing algorithms, so I tried to write the thing in Pascal because Pascal had sets, which Ken Thompson had permitted to be of arbitrary length. The program worked, but it was almost 200 lines long - almost too big for the Pascal system. I talked to Thompson to figure out how I could make the Pascal system handle this program. Chuck Haley and I got involved in trying to make the Pascal system handle it, but the thing was wrong because it was building the entire thing in core. So I got sucked in, got department help, and built some hope of receiving enough support eventually to pay for this program to work under Pascal.
But while we were doing that, we were sort of hacking around on ed just to add things. Chuck came in late one night and put in open mode - where you can move the cursor on the bottom line of the CRT. Then George Coulouris from Queen Mary College came to Berkeley and brought along this thing called em, which stood for "editor for mortals." It had two error messages instead of one. It had a prompt, and its own strange version of open modes done for ITT terminals, which really didn't work very well on ADM-3As.
So Chuck and I looked at that and we hacked on em for a while, and eventually we ripped the stuff out of em and put some of it into what was then called en, which was really ed with some em features. Chuck would come in at night - we really didn't work exactly the same hours although we overlapped in the afternoon. I'd break the editor and he'd fix it and then he'd break it and I'd fix it. I got really big into writing manual pages, so I wrote manual pages for all the great features we were going to do but never implemented.
Eventually Chuck graduated with his Ph.D. for his part of the Pascal system. After he left, there was ex Version 0.1 at the Computer Center. There was a version of the editor from EP016, which stood for September 1, '76, the date that the binary was created - after which we promptly lost the source because we were making so many changes and didn't have SCCS.
Really, what started it all was that we got some ADM-3As to do screen editing. I remember right after Carter got elected, I was sitting in my apartment in Albany, CA, on a Saturday listening to people call Carter and ask stupid questions while I designed the screen editor. That dates it: it was probably '76. It was really a consequence of our initial frustration with Pascal. It went on from there. I stopped working on it whenever they made the reference cards - '78 - '79 - and I really haven't worked on it for five years.
Mark Horton brought his editor, called hed ("Horton's editor"), along from Bell Labs. He was disappointed when vi won out over it. But vi had momentum with the local users - and Mark, somewhat out of frustration, went out and actually supported vi. That was nice, because I didn't have the patience to do it anymore. Just putting the termcap entries in that people would mail me would take hours a week, and I was tired after three or four years.
REVIEW: Didn't Bruce Englar implement the count fields feature?
JOY: Bruce suggested that. At one point there was an acknowledgment section in the documentation for the editor that mentioned all the people who had helped - I don't know if it's still there in Volume 2.
A lot of the ideas for the screen editing mode were stolen from a Bravo manual I surreptitiously looked at and copied. Dot is really the double-escape from Bravo, the redo command. Most of the stuff was stolen. There were some things stolen from ed - we got a manual page for the Toronto version of ed, which I think Rob Pike had something to do with. We took some of the regular expression extensions out of that.
REVIEW: What would you do differently?
JOY: I wish we hadn't used all the keys on the keyboard. I think one of the interesting things is that vi is really a mode-based editor. I think as mode-based editors go, it's pretty good. One of the good things about EMACS, though, is its programmability and the modelessness. Those are two ideas which never occurred to me. I also wasn't very good at optimizing code when I wrote vi. I think the redisplay module of the editor is almost intractable. It does a really good job for what it does, but when you're writing programs as you're learning... That's why I stopped working on it.
What actually happened was that I was in the process of adding multiwindows to vi when we installed our VAX, which would have been in December of '78. We didn't have any backups and the tape drive broke. I continued to work even without being able to do backups. And then the source code got scrunched and I didn't have a complete listing. I had almost rewritten all of the display code for windows, and that was when I gave up. After that, I went back to the previous version and just documented the code, finished the manual and closed it off. If that scrunch had not happened, vi would have multiple windows, and I might have put in some programmability - but I don't know.
The fundamental problem with vi is that it doesn't have a mouse and therefore you've got all these commands. In some sense, it's backwards from the kind of thing you'd get from a mouse-oriented thing. I think multiple levels of undo would be wonderful, too. But fundamentally, vi is still ed inside. You can't really fool it.
It's like one of those piñatas - things that have candy inside but layer after layer of papier-mâché on top. It doesn't really have a unified concept. I think if I were going to go back - I wouldn't go back, but start over again.
I think the wonderful thing about vi is that it has such a good market share because we gave it away. Everybody has it now. So it actually had a chance to become part of what is perceived as basic UNIX. EMACS is a nice editor too, but because it costs hundreds of dollars, there will always be people who won't buy it.
REVIEW: How do you feel about vi being included in System V?
JOY: I was surprised that they didn't do it for so long. I think it killed the performance on a lot of the systems in the Labs for years because everyone had their own copy of it, but it wasn't being shared, and so they wasted huge amounts of memory back when memory was expensive. With 92 people in the Labs maintaining vi independently, I think they ultimately wasted incredible amounts of money. I was surprised about vi going in, though; I didn't know it was in System V. I learned about it being in System V quite a while after it had come out. They had this editor, se, but I guess it failed.
I think editors have to come out of a certain kind of community. You need a cultural context. As you mentioned before, Bruce Englar thought of a number of things, Dick Fateman contributed work to the cursor position after the join command - little things like that. There were just dozens of people involved, but if you are in an environment where management says, "This person shall do an editor," it doesn't necessarily work. It's funny, the politics at Bell Labs.
REVIEW: You had said, when you were giving a demonstration earlier today, that when you are on foreign systems you use ed.
JOY: That's right. Absolutely.
REVIEW: You don't even try to use vi?
JOY: I'm used to having a 24-line terminal with no ability to scroll back. The reason I use ed is that I don't want to lose what's on the screen. I used to have a Concept terminal which had eight pages of memory, like a mini-version of a window system. I just don't like to lose what's in the window. I'm looking forward to the editor that's going to be embedded in the window system Warren Teitelman is working on. Having editing functionality everywhere would be great in the same sense that it would be nice to have history everywhere.
REVIEW: So there will be a history mechanism in the new editor?
JOY: I would be surprised if there wasn't. Warren basically invented all those things. He's very keen on that. I tried to use EMACS and I liked it. The problem was I spent all my time programming it because it was improving so fast that my programs kept breaking. I got tired of maintaining my macros so I guess I'm looking forward to an editor I can learn and then forget about.
I started to write a new editor not too long ago and had it about half done after two days. It was going to have almost no commands but, instead, use what's basically the Smalltalk editing menu, a scroll bar, and a thumb bar. Lines just went off the right, and if your window wasn't big enough - too bad - it just threw them away. There was going to be an edit file, and a store and read file. That was it.
It was called be. I'll let you guess what that stands for. It actually stands for about eight different things.
REVIEW: Bill's editor?
JOY: That's one of the eight. It's also the English verb "to be" because it is. There are six more. I got tired of people complaining that it was too hard to use UNIX because the editor was too complicated. Since I sort of invented the editor that was most complicated, I thought I would compensate by also designing the editor that was most simple. But I got distracted. If I had just spent another day on it... I could actually edit a file on it. I actually used it to edit itself and scrunched the source code - sort of old home day, because we used to do that all the time.
I had threatened to remove all the copies of vi on January 1 of this year and force people to use be. I don't think it would have worked, though, because I don't know any of the root passwords here anymore. These editors tend to last too long - almost a decade for vi now. Ideas aren't advancing very quickly, are they?
REVIEW: So you use Interleaf now?
JOY: I use Interleaf for all my documentation. When I'm writing programs, I can type them in half the time with cat because the programs are six lines - a #include, main and a for loop and one or two variables. I actually use vi for editing programs. James Gosling did a really nice editor as part of a project at Carnegie Mellon University which is AWYSIWYG: Almost What You See Is What You Get. It's also a program editor built into the window system he's working on. I think that will ultimately replace vi.
Interleaf is very nice. I expect there to be a lot of competition for programs like that. I don't expect that to be the only one. By the end of next year there will be half a dozen UNIX-based integrated office systems. Interleaf is based on the formatting process.
I think you'll see others focused on database, calendar management, mail, and spreadsheets - you need all those things to have a generic office automation application. I don't really know who is going to win. I know about a few that are unannounced, but it's not clear which is the most desirable. None of them are open, really. None are as programmable as UNIX. You really can't go in and add things that you need. That lack of programmability is probably what ultimately will doom vi. It can't extend its domain.
REVIEW: Some would argue that vi's domain is already far too extended.
JOY: That's probably fair, too. That's why it's so complicated, and has left and right parentheses commands. You start out with a clean concept and then sort of accrete guano. It lands on you and sticks and you can't do anything about it really.
REVIEW: What is it that Interleaf offers you that EMACS doesn't?
JOY: I can just look at my screen, and when I print it off, it's the same as it looks on the screen. It is formatted, and I'm tired of using vi. I get really bored. There have been many nights when I've fallen asleep at the keyboard trying to make a release. At least now I can fall asleep with a mouse in my hand. I use the Xerox optical mouse instead of the other one because it is color coordinated with my office. Did you notice? I prefer the white mouse to the black mouse. You've got to have some fun, right?
This business of using the same editor for 10 years - it's like living in the same place for 10 years. None of us does it. Everyone moves once a year, right?
REVIEW: What about Documenter's Workbench and Writer's Workbench?
JOY: I used to use diction. I wrote some papers for some conferences and used diction on them. But with Interleaf I don't even have a spell program.
REVIEW: Why?
JOY: Don't need one. Well, I guess I do. I could use one. It just doesn't have spell.
REVIEW: You don't use spell?
JOY: I don't spell things wrong. Except t-h-i-e-r. But no, I don't generally have trouble with spelling mistakes. What spell did for me was catch errors introduced by the substitute commands. With sentences in ed or something, you only see one line, and substitutes can be done wrong. With spell, you can catch fuzzballs that show up in your document. But diction is funny. I didn't like reading documents when diction got done with them.
REVIEW: Did you use suggest?
JOY: Yes, I've tried some of those things. I don't like reading things that have been heavily dictionized because they don't flow. I would like to have an expert system that would help me but I don't think those (style and diction) are close enough. I don't need double or spell either. I don't think any of those can help me write better or be better organized. I think an editor with a hierarchical sort of structure where I could look at the section outlines or make annotations in the margin would be helpful. Post-it notes are perhaps the greatest technological thing in the last 10 years. An electronic analog of the post-it would be wonderful so you could scribble on the document. I find much more of a need to just doodle on the screen than to run these programs. I think some of these tools are overkill. Writers Workbench is fine if you're stuck with troff and nroff.
I've never used pic. Some people have done great stuff with it, but it is too bad that instead of allowing you to think pictorially and draw pictorially, it forces you to translate images back into words and then compile back. That seems like the Linda Ronstadt song, "You're Taking the Long Way Around."
REVIEW: You've already mentioned the mouse. What other hardware do you see for the documenter to make things better?
JOY: I think the Macintosh proves that everyone can have a bitmapped display. The fundamental tension in UNIX that I think AT&T doesn't understand is that everyone is going to have a bitmap. Bitmap display is media compatible with dot matrix or laser printers. With a mouse to point with, you've got sort of a baseline of facilities around which you can build a document environment. I think you also need a full-page display. I think we'll have to wait for Big Mac from Apple, maybe two years away, to get full-page display. I think the implication for developers is that this kind of development has to come from the low end, the Volkswagen of the document industry.
Document preparation systems will also require large screen displays. Something like the Sun is what I think you need - a 19-inch screen where you can see a full page and be able to put up screens and menus with something that's fast enough to allow you to scroll at a reasonable rate. We don't know how to do that without a mouse, really. All of the good research has been done using a pointing device.
Touch finger, joystick, voice input are all either too late or too early.
REVIEW: Voice is too early and touch is too late?
JOY: I'm not sure voice works yet. I can't talk clearly enough. There was an editorial in Datamation about why the UNIX user interface is horrible. It was pretty poor, but the author does have some good things to say. I think he says something about people buying stoves. If you look at stoves and the way the knobs are arranged, you'll see why it is that when you walk up to a random stove you can't tell which knob is going to turn on which burner. It's really stupid. There is no sensible way to put the knobs on the front to tell you. Some stoves have the knob in the center right next to the burners. That makes a lot more sense.
The point is that you want to have a system that is responsive. You don't want a car that talks to you. I'll never buy a car that says, "Good morning." The neat thing about UNIX is that it is very responsive. You just say, "A pipe to B" - it doesn't blather at you that "execution begins," or "execution terminated, IEFBR14."
The trouble is that UNIX is not accessible, not transparent in the way that Interleaf is, where you sit down and start poking around in the menu and explore the whole system. Someone I know sat down with a Macintosh and a Lisa and was disappointed because, in a half hour, he explored the whole system and there wasn't as much as he thought. That's true, but the point is in half an hour, almost without a manual you can know which button to push and you can find nearly everything. Things don't get lost. I think that's the key.
Systems are going to get a lot more sophisticated. Things will tend to get lost unless the interfaces are done in the Macintosh style. People who use these machines may run applications but won't necessarily be skilled at putting applications together. A lot of these people won't even have access to the underlying UNIX system.
The fundamental tension in System V is that it is oriented toward a character-mapped environment. The software you have to build is completely different. You don't assume a mouse and you don't assume a reasonable-sized display. You just forget it. Those are two different problems.
It's ultimately the media and the set of peripherals that you have on your machine that affects what the user sees. I don't think the Macintosh software is of any value. I'm not even sure it can be taken to a larger machine. You can spend your time making software small, or you can spend time making it functional and sensible. You can't do both. I think there is an ax that is going to chop the two apart.
You'll see WordStar, the database sort of word processing environment that doesn't have bitmaps, and you'll see the ones that do, and the difference between the two will be like night and day... The Macintosh's days are numbered. Non-bitmap machines have no future. Personal laser printers will see to that. The days of non-raster stuff are numbered, though sheer momentum will carry it to the end of the decade. These things come and go.
We went from printing terminals to dumb CRTs to smart CRTs, with tangents off toward storage display tube displays and black and white bitmaps. I think the days of even black and white bitmaps are very numbered. Color will take care of that. And then, with the demise of the last vacuum tube, which is the CRT, and with the advent of thin film transistors, which will be flat displays, it will all be color.
Black and white bitmaps supplanting CRTs make for a small wavelet, but if you don't see that little wavelet, you're really going to get hit by the tsunami that is to come.
I've wiped troff off my machine, and I'd rather live in the bitmap world than in the spell/diction world. I want to get mud in my face and arrows in my back with the bitmap.
REVIEW: The basic UNIX tools that can be used for documentation are plentiful but misunderstood. For example, the use of make to do document control. What are your views on that?
JOY: I think make is the program that causes people to write the things down that formerly were scribbled on the wall. It's sort of the graffiti recorder. That's the wonderful thing about it. People don't use SCCS and make enough. The people here doing documentation now use SCCS, mostly because I put all the documentation under SCCS and sort of twisted people's arms into using it.
REVIEW: Real programmers use cat as their editor.
JOY: That's right! There you go! It is too much trouble to say ed, because cat's smaller and only needs two pages of memory - plus you're not likely to get swapped out. That's why ed didn't prompt, you know. The performance of the system was just horrible. It would swap things out randomly and do all sorts of things. In ed you might type "a", but have no idea how far behind the system was. And it didn't matter, as long as it didn't get more than a few hundred characters behind and start throwing lines away without telling you.
Typically it wasn't that bad. If it had been prompting, you would have hit carriage return and waited for the prompt, and it would have taken three seconds for the prompt to come. That's something we noticed when we put em up. We put in the prompt and suddenly realized it had to go through the operating system.
I think UNIX has lived with grace for years. We've had the grace of people not being able to tell when the system was doing a bad job of scheduling the CPU. Now we can't hide behind time-sharing.
I think SCCS is misunderstood. I think make has never had good documentation. Henry McGilton just finished rewriting a troff manual for the first time. Since troff has never really had a manual, he had to sit down and figure out what some of these things meant - backslash, right-adjusting tab stops. No one ever really wrote a good manual for it - partially because Joe Osanna, who wrote troff, died in 1976. The program was written in assembly language, then translated line for line into C, and it's all done with global variables - it's an ancient program. It's basically an accretion of all this completely unrelated stuff on top of a very, very small base. It's not surprising that people don't understand it.
When you look at the manual, tbl looks really good but you sort of get it right by iterative approximation; it's very difficult to get a good-looking table. I think the thing that's really missing is that none of these things help you with graphics or graphic design very much. I want a program that helps me learn how to draw and learn how to paint. Some attempt is made at that by pic but it's solving it in the wrong domain. I don't want to type "arc from A to B." I wouldn't mind saying that, though. Maybe that's the answer: talk to the program.
I think the hard thing about all these tools is that it takes a fair amount of effort to become proficient. Most people are lazy. I'm lazy. I'm enjoying using other people's software now. At Berkeley for so long, all the software we were using was stuff I had something to do with and that wasn't fun. I have fun with Interleaf because if it crashes, I don't feel responsible. I've even divested myself of responsibility for the operating system. I don't have to worry about that crashing. Editing without guilt.
REVIEW: All the directions you were talking about really assume a lot more compute power than we have at present.
JOY: I think that to make that assumption is bad. All projections that I see have memory going to $300 a megabyte by 1989. Soon the processor will be $50, and you'll be able to use it to refresh video. There are too many good things you'll be able to do with this stuff for it not to be available cheap. The real cost is very low. One has to wonder what software is going to be worth, too. It's going to be produced in such enormous volume.
When can that stuff go portable? You don't really want to have a telephone in the office or be tied to an office. You'd like to have the office with you and the phone with you. I want to be able to turn the phone off, thank you. I think that's going to require very different technology.
REVIEW: You mention everything but disks.
JOY: You might want to page over satellite telephone... Page fault, and the computer makes a phone call. Direct broadcast or audio disk - that's the technology to do that. It's half a gigabyte - and you get 100 kilobyte data rate or a megabyte or something. I don't remember. You can then carry around with you all the software you need. You can get random data through some communications link. It is very like Dick Tracy. Have you seen these digital pagers? You can really communicate digital information on a portable.
I don't think you need to have a disk with you. There are so many people who believe that you need to have a disk that you'll be able to have one because they'll make it cheap. That's the way things work. It's not what's possible but what people believe is possible. That's what makes imagination so wonderful, right? Silicon is such an incredible amplifier. If you can figure out what should be and you get people to believe you enough that they will give you money, you can almost make it come true. That's why bubble memory never made it. People didn't believe it was the right thing to do. But there's nothing wrong with bubble memory.
There's an incredible amount of momentum in the technology. Look at the momentum in vacuum tubes. It's all an economy of scale. There's an incredible momentum in UNIX. It really doesn't matter what UNIX is anymore. It ceased to matter when the vendors started adopting it. People used to call me up saying, "I don't know what UNIX is but I've got to have it! How do I get it?" It's at that point now.
REVIEW: Like designer jeans?
JOY: I don't buy designer jeans - well, what I'm wearing aren't bad jeans. They're my burlap sacks. Wrinkles are in, you know.
74
Zim/Interview/Interview_with_Dennis_M._Ritchie.txt
Normal file
@@ -0,0 +1,74 @@
Content-Type: text/x-zim-wiki
Wiki-Format: zim 0.4
Creation-Date: 2011-06-02T18:29:38+08:00
====== Interview with Dennis M. Ritchie ======
Created Thursday 02 June 2011
http://www.linuxfocus.org/English/July1999/article79.html
This is an interview with Dennis M. Ritchie, the man who was one of the developers of C and the Unix OS.
Who is Dennis M. Ritchie?
Some people become important because they change history; others make history. Dennis Ritchie belongs to the second group of people. When most of us were still learning to walk, he developed the "C" language, the most widely used programming language. It is not necessary to stress the relevance of this contribution to mankind.
But that was not enough for him. Dennis Ritchie and Ken Thompson developed the Unix operating system, i.e. The Operating System. Yes, he created UNIX.
He has not stopped working on computers and operating systems, and as a result, Plan 9 and Inferno were developed by the group of researchers under his leadership.
His work has been recognized by numerous computer organizations: ACM award for the outstanding paper of 1974 in systems and languages; IEEE Emmanuel Piore Award (1982); Bell Laboratories Fellow (1983); Association for Computing Machinery Turing Award (1983); C&C Foundation award of NEC (1989); IEEE Hamming Medal (1990); etc.
Currently, Dennis M. Ritchie is head of the System Software Research department in the Computing Science Research Center of Bell Labs/Lucent Technologies in Murray Hill, NJ.
Interview part
LF: The same way many children want to be Superman, you are the idol of many C programmers and UNIX fans (among others) all over the globe. How does it feel to be adored by thousands of UNIX and C programmers? It's completely impossible to imagine ourselves today without UNIX or C. When you created C and began to work on UNIX, did you expect it would be 'THE FUTURE' of Computer Science?
Dennis: These two questions are much the same, and ones that are often asked. Obviously the rewards and appreciation that I and my colleagues have received are very pleasant, and we do feel that we have helped create something of real value. But no, we did not really expect that this would be "the future" or even anticipate the eventual influence of the work. It was taken up in the spirit of "let's build something useful" and in the meantime do the work needed to help others take part. It's important to keep in mind that although the whole Unix and C or C++ segment is significant, the world of computer science and technology and real products is much larger. This is true both in the academic direction of the study of programming languages, and in the big-money area of software.
LF: If UNIX is the present and the past of the Operating Systems, C could be considered without doubt 'THE LANGUAGE', despite all the Object-Oriented languages that have appeared in recent years. How do you see C++ and Java, and the frequent flame wars between C and C++ programmers?
Dennis: C++ benefited enormously from C, because C had a fairly large acceptance even before the growth of C++, and C++ could use C both as a base to build a new language and as a tool to create its compilers. C has been characterized (both admiringly and invidiously) as a portable assembly language, and C++ tries to lift its level to object orientation and a more abstract approach to programming. The faults of both (in recently emerging standards) seem to be excessive ornamentation and accumulation of gadgetry. They both have a certain spirit of pragmatism, of trying to understand what's really needed. Java is manifestly a C++ descendant, at once cutting away some of the C legacy having to do with pointers, and adding the idea (not so new, but maybe now really feasible) of machine-independent object files. Now that it's been caught up in machinations between Sun and Microsoft (and also has its own problems with ornamentation) it's hard to guess where things will go.
LF: Now a hypothetical question: from today's perspective, and after so many years of C experience, is there anything different you would have done if you had to design C from scratch?
Dennis: Finding a way (given the limitations of the time) to insist on what has been in the ANSI/ISO standard for some time: complete declaration of the types of function arguments, what the 1989 C standard calls function prototypes. There are many smaller details that are more messy than they should have been. For example, the meaning of the word "static", which is confusingly used for several purposes. I'm still uncertain about the language declaration syntax, where in declarations, syntax is used that mimics the use of the variables being declared. It is one of the things that draws strong criticism, but it has a certain logic to it.
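As a hedged illustration of the two points Ritchie raises (the examples are invented, not his):

    /* "static" doing two unrelated jobs: */
    static int total;        /* at file scope: the name is private to this file */

    int bump(void)
    {
        static int calls;    /* inside a function: the value persists across calls */
        return ++calls + total;
    }

    /* Declaration syntax that mimics use: */
    int *a[10];              /* *a[i] is an int, so a is an array of 10 pointers */
    int (*b)[10];            /* (*b)[i] is an int, so b is a pointer to an array */

    /* And a function prototype, with argument types declared in full: */
    long f(int n, char *s);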
LF: While C is a well established and completely defined language, operating systems are still very much in evolution. New ideas come as hardware gets faster and cheaper. What are the future key issues that will be at the basis of OS design? In particular, what is your opinion concerning micro-nano-kernels versus monolithic designs?
Dennis: I don't think this is really an interesting issue, framed that way. I do strongly prefer environments for applications that provide a structured, common name-space and mechanisms for accessing resources, along the lines of Unix (I include Linux here), Plan 9, Inferno. It looks to me that the idea of micro- or nano-kernels didn't really become important in real use, at least as the basis for general-purpose systems. In practice, what seems to happen is that the micro-kernel becomes specialized to the macro-system on top of it. It might remain a useful tool for internal structuring of a system, but doesn't really live on its own. Of course (the world being complicated) there are cases where very simple operating systems are useful for small, appliance devices that aren't intended for general-purpose use, whether desktop or machine room.
LF: UNIX is by now an operating system with a long history. It was created many years ago, and since then the capabilities and requirements of networks, hardware, services and applications have evolved enormously. What are the current limitations or handicaps of UNIX in the face of present and near-future user demands?
Dennis: I don't see any fundamental, technological ones, in terms of the basic system API ("system calls"). There is of course an enormous commercial/political issue in terms of jousting between the Unix commercial vendors and now between the various "free" Unix suppliers, including Linux and *BSD.
LF: Recently there has been a great deal of concern with the approaching year 2000 and the potential for a meltdown of the Internet due to the infamous Y2K bug. Do you believe there is any foundation in the apocalyptic predictions made by some experts?
Dennis: No intelligent comment on this, really. I will not be flying at 23:59 31 December 1999, but since I have not been near an airplane at New Year at any time in my life, this fact probably has little to do with Y2K.
LF: This wouldn't be a complete interview if we didn't mention Inferno, the operating system you are currently working on. What were the main reasons to design a totally new operating system, together with Limbo, its own programming language? Also, why Inferno/Limbo if there is JavaVM/Java? In other words, what does Inferno have to offer that Java lacks?
Dennis: The Inferno work was the brainchild of Phil Winterbottom and Rob Pike, and it started just before the Java bandwagon (publicity machine) emerged. Java did have its own predecessor (internally called Oak), but at the time Inferno was hatched there was not yet any reason to think that the phenomenon would emerge, and although we became aware of Java, it was still somewhat unformed. I think it's one of those odd convergences that a venerable technology idea (a language implemented by a portable virtual machine) was revived both by Sun and by us. That said, the Inferno idea was from the start more interesting in terms of OS technology (a language and an OS that would work both on raw, and very cheap, minimal hardware, and also identically as an application under Windows or Unix or Linux). At the same time one has to give Sun credit for hooking better into the vastly explosive WWW/browser market.
LF: It seems to us that the future of Limbo as a programming language is tied to the expansion and popularity of Inferno as an operating system. Does it make sense to port Limbo to other operating systems? Or are its design and objectives too dependent on Inferno?
Dennis: Technologically, Limbo is not particularly dependent on Inferno. Realistically, it is indeed dependent, simply because a new language depends on an environment in which it is used.
LF: Taking a look at your career at Bell Labs, it seems that you have worked at every moment on the projects you really liked, and I presume this is also true of Inferno. Am I wrong in asserting that you really enjoyed your work on UNIX and C design?
Dennis: I have indeed enjoyed my career at Bell Labs (which continues).
LF: I cannot avoid making a comparison between you and all the people who are currently working on non-profit projects for free, just because they like it - although I am sure they wouldn't refuse money for the work they do for free. Can you see yourself involved in projects like Linux, or similar, if you were not at Bell Labs? How do you see all these people, from inside an innovative research lab and with many years of experience on your shoulders? Since our magazine is mainly for Linux users, we cannot forget to ask you a question about Linux. First of all, what is your opinion about all the Linux momentum, and the decision of many companies to start developing software for it (Bell Labs, for example: Inferno has its own port to Linux)?
Dennis: Let me put these questions together. I think the Linux phenomenon is quite delightful, because it draws so strongly on the basis that Unix provided. Linux seems to be among the healthiest of the direct Unix derivatives, though there are also the various BSD systems as well as the more official offerings from the workstation and mainframe manufacturers. I can't help observing, of course, that the "free source" Unix-derived world seems to be suffering from exactly the same kind of fragmentation and strife that occurred and is still occurring in the commercial world.
LF: And the big question about Linux. Have you ever used Linux? Well, if so, what's your opinion of it?
Dennis: I haven't actually used it for real -- in the sense of depending on it for my own day-to-day computing -- I am afraid to admit. My own computational world is a strange blend of Plan 9, Windows, and Inferno. I very much admire Linux's growth and vigor. Occasionally, people ask me much the same question, but posed in a way that seems to expect an answer that shows jealousy or irritation about Linux vs. Unix as delivered and branded by traditional companies. Not at all; I think of both as the continuation of ideas that were started by Ken and me and many others, many years ago.
LF: And Microsoft... What do you think about the monopoly this company currently has over desktop computing? In the past, sci-fi films depicted a world dominated by macro-computers that rule all aspects of our daily life. The current reality has shown us a different picture: computers, in many respects, have been relegated to a simple appliance. You, who developed an operating system intended for programmers, who lived through all that sci-fi scene, and who imagined the actual computing situation, how do you imagine the future of computing? What place do you think Inferno and Linux have in it?
Dennis: That's two questions. Microsoft does have some sort of monopoly over desktop computing, but that's not the only interesting computing in the world. Both alternate ways of supplying software (like Linux) and bits of the world that don't get in the news as much as Windows or browser wars (like very high-performance computing, very reliable computing, very small computing) will all have a place. I trust that both Linux and Inferno will prosper.
79
Zim/Interview/Leap_In_and_Try_Things.txt
Normal file
@@ -0,0 +1,79 @@
Content-Type: text/x-zim-wiki
Wiki-Format: zim 0.4
Creation-Date: 2011-03-27T20:45:17+08:00
====== Leap In and Try Things ======
Created Sunday 27 March 2011
Leap In and Try Things: Interview with Brian Kernighan
{{../An_Interview_with_Brian_Kernighan/kernighan.jpg}}{{../An_Interview_with_Brian_Kernighan/c-book.jpg.gif}}{{../An_Interview_with_Brian_Kernighan/brian_kernighan_studentssm.jpg}}{{../An_Interview_with_Brian_Kernighan/brian_kernighan_princeton.jpg}}
Referred to as "K&R"
Brian Kernighan (pronounced ker-ni-han), Professor of Computer Science at Princeton University, co-authored The C Programming Language, which has sold millions of copies and has been translated into 27 languages.
C is one of the most popular computer programming languages, and it has influenced nearly all languages in use today, including C++, C#, Java, Javascript, Perl, PHP, and Python.
Before becoming a full-time professor at Princeton, you had a long and brilliant career at AT&T Bell Labs. But let’s go back further. Can you tell us about your childhood and early interests?
Professor Brian Kernighan
I was born in Canada and grew up in and around Toronto. My father was a chemical engineer, which gave me a bit of exposure to some kinds of science and engineering. I went to the University of Toronto in a program called “Engineering Physics”, which was meant for students who were pretty good in math and science but didn’t have any idea what they wanted to do. It was extremely tough because there was a heavy workload and a lot of material — academically, I don’t think I have ever worked as hard since. But it was a very good foundation for all kinds of later studies, and of course the experience of just working hard full time was good (though painful at the time). I didn’t really encounter computers until I was nearly done with my undergrad education, but when I did first start to play with computers, I found them great fun, and of course still do.
What events and decisions led you to join Bell Labs?
I was a grad student at Princeton, and was lucky enough to be able to spend a summer at MIT in 1966, working on Project MAC. I met some amazing people, I worked on the state-of-the-art time-sharing system CTSS, helping to build the next version, Multics, and I learned a lot. The Multics project included not only MIT but Bell Labs, so I at least knew of people from Bell Labs who were contributing as well. The following summer, I got a job at Bell Labs, where I worked with Doug McIlroy, the inventor of macros (among other things), and got to know the people who were working on Multics at the Bell Labs end. That worked out well enough that I returned for a second summer, this time working with Shen Lin, a wonderful computer scientist and applied mathematician. Shen and I worked on an interesting problem, graph partitioning, that became my PhD thesis topic. When I finished at Princeton six months later, Bell Labs offered me a job in the same group as Doug and Shen, and I never even thought about another job — Bell Labs was an idyllic place, with great people, lots of resources, and freedom to do anything at all.
You are probably best known to the world for co-authoring with Dennis Ritchie The C Programming Language, which is generally known as “Kernighan and Ritchie”, or just “K&R”. Can you tell us how you two met and came to write the book together?
Dennis was one of the people who had been involved with Multics at Bell Labs. He and I were in the same group, though I don’t think we worked on anything together for several years. But at some point, I wrote a tutorial on how to program in B, which was the precursor language to C, and that led naturally to a tutorial on C when Dennis created that language. After a while, it seemed like a good idea to try a book on C, and I twisted Dennis’s arm into writing it with me. That was probably the smartest thing I ever did — Dennis is an exceptionally clear and elegant writer, and of course he knew the language better than anyone else, so the book was accurate and authoritative.
When writing the book, and afterwards, did you have an idea of what a major impact C would have on the computing world?
I certainly did not. I can’t speak for Dennis, but I’ll bet that he didn’t either.
What do you think made the book, and C, so successful?
Lecturing
With C, Dennis managed to find a perfect balance between expressiveness and efficiency. It was just right for creating systems programs like compilers, editors, and even operating systems. C made it possible for a programmer to get close to the machine for efficiency but remain far enough away to avoid being tied to a specific machine. It was pretty clear how any C construct would be compiled, and it was possible to write quite good C compilers for almost any reasonable architecture. As a result, C became in effect a universal assembler: close enough to the machine to be cost effective, but far enough away that a C program could be compiled for and run well on any machine. I think the book has been successful in large part because of the success of C, though it probably helped that the book, like the language, is rather small and simple, and made it possible for people to do useful things quickly. And of course we were lucky in our timing, first with the appearance of comparatively cheap minicomputers like the PDP-11 and later with the arrival of the PC.
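[Editor's note: to illustrate the "universal assembler" point, here is a small C sketch of my own, not from the interview. Each construct maps onto an obvious sequence of machine operations: indexing becomes address arithmetic, and *p++ becomes a load plus a pointer increment.]

	#include <stdio.h>

	/* Sum an array through a pointer. Illustrative only: each
	 * construct corresponds to simple machine-level operations. */
	long sum(const int *a, int n)
	{
	    long total = 0;
	    const int *p = a;
	    while (n-- > 0)
	        total += *p++;   /* load, advance the pointer, add */
	    return total;
	}

	int main(void)
	{
	    int data[] = {1, 2, 3, 4, 5};
	    printf("%ld\n", sum(data, 5));   /* prints 15 */
	    return 0;
	}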
While writing the book, did you experience flashes of inspiration?
I don’t think there were any flashes of inspiration in book-writing, at least that I recall. In fact, I more remember long periods of trying to figure out the right way to explain some idea, like pointers or how to do input without pointers. It always required a lot of rewriting to find the right way to express something.
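[Editor's note: pointers are a good example of the kind of idea that is hard to explain well. As a rough sketch (mine, not the book's text), the same string copy can be written with array indexing or with pointers; putting the two side by side is one way such an explanation can be built up.]

	#include <stdio.h>

	/* Copy a string using array indexing. */
	void copy_index(char *to, const char *from)
	{
	    int i = 0;
	    while ((to[i] = from[i]) != '\0')
	        i++;
	}

	/* The same copy using pointers; the work happens
	 * entirely in the loop condition. */
	void copy_pointer(char *to, const char *from)
	{
	    while ((*to++ = *from++) != '\0')
	        ;
	}

	int main(void)
	{
	    char a[16], b[16];
	    copy_index(a, "hello");
	    copy_pointer(b, "hello");
	    printf("%s %s\n", a, b);   /* prints: hello hello */
	    return 0;
	}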
You also worked with Dennis Ritchie and Ken Thompson on UNIX. In fact, you came up with the name. Can you tell us that story?
I did not work with Ken and Dennis on Unix itself, though I was one of the first users and I’ve written some programs that became part of the core Unix tool set. As for the name, my memory (not always to be trusted) is that people were looking for a name, and I observed that Multics was “many of everything”, while Unix was “at most one of anything”; these were very weak puns on some Latin roots. I would have spelled it Unics just to be consistent with Multics, but someone else — no idea who — changed the spelling to the much better Unix.
It seems you have a passion for programming languages. What do you think of the languages in use today? How has C influenced them?
C has clearly influenced a number of languages. C++ is the most obvious; it adds a rich set of object-oriented features but is almost compatible at the source and object level with C. Java derives syntax and quite a few basic ideas from C. AWK steals syntax directly, as do languages like Javascript and Objective-C. Personally, I use a variety of languages, depending on what seems most appropriate for the task at hand. For the last few years, that has meant that I’ve written a lot of AWK, a fair amount of Python and Javascript, some Java, and occasional C and C++. At this point, these are starting to blur in my mind, and it takes a day or two to switch from one to another, just because of trivial syntactic issues and different libraries. I like them or dislike them all more or less about the same, depending on what I find easy or unnecessarily difficult at the moment. But I still think that if I could only use one language, I’d use C, just because I could build anything with it.
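[Editor's note: the syntactic influence is easy to see in practice. The loop below is written in C, but its body reads almost identically in C++, Java, JavaScript, and AWK; a sketch of my own, not Kernighan's.]

	#include <stdio.h>

	int main(void)
	{
	    /* Apart from the declarations, this loop is (nearly)
	     * valid C, C++, Java, JavaScript, and AWK. */
	    int i, total = 0;
	    for (i = 1; i <= 10; i++)
	        total += i;
	    printf("%d\n", total);   /* prints 55 */
	    return 0;
	}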
What led to your decision to join Princeton? How has this experience been different from the Labs?
At Princeton (image created by Amit Chatwani)
Bell Labs was a wonderful place for me for over thirty years. But that’s a long time to spend in one job. I had done occasional teaching over the years, and in 1996 I spent a full-time semester at Harvard, where with a huge class I learned that I could handle all of the mechanical challenges of teaching, and that it was still fun. So when Princeton made me an offer, it was not a leap into the unknown but a chance to do something that I really enjoyed, in a school that I really liked. I’ve been here ten years now, and although I loved Bell Labs, this has turned out to be even better. The job is different in almost every way. I spend much of my time during the school year trying to teach well, interacting with students, advising undergraduate research, and the like. But summers are a complete change of pace, and I can usually spend time as a working programmer somewhere; that keeps me in touch with real-world software development. I enjoy the annual cycle.
In your mind, what challenges do students face today, both internally and externally? What advice do you have for them?
I was lucky — computing was a very young field and there was a lot that could be done without great resources or even much more than a good idea. Computing is a lot more mature today, at least in some ways, so it’s probably a lot harder to find the low-hanging fruit. But at the same time, students today have resources beyond the imagining of people of my age — an Internet that connects everyone, hardware that costs almost nothing, and free languages and tools and development environments that make it possible to create sophisticated systems by plugging together components that others have created. So the job is perhaps harder but also easier. My advice is to leap in and try things. If you succeed, you can have enormous influence. If you fail, you have still learned something, and your next attempt is sure to be better for it.
What suggestions or tips would you offer to graduates who are trying to figure out what they want to do in life?
With students at University of Pennsylvania, 1995
Do something you really enjoy doing. If it isn’t fun to get up in the morning and do your job or your school program, you’re in the wrong field. Not everyone is lucky enough to have a job or an academic program that rewards them continually, so perhaps your fun will come instead from some outside activity, or friends and family. But there has to be something that makes your life rewarding. Be open to as many experiences as you can, to maximize your chances of finding the really rewarding things. Dick (Richard) Hamming once told me, “Never say no.” That has to be applied with a bit of caution, of course, but the basic idea — try new things — is completely sound.
What advice would you offer to someone approaching retirement?
Don’t retire. Or, more precisely, don’t stop doing things. Keep learning and exploring. I retired from Bell Labs 10 years ago and took up teaching; it’s been more fun than I could have imagined, and it’s probably added 10 years to my life. And if I retire from Princeton, I’ll try to find something new that has that same effect — something that keeps me active, involved, learning, and meeting new people.
Shifting gears a bit, can you share your thoughts on India and its recent tremendous growth, facilitated by computers and the internet?
All that technology will help, but fundamentally, it’s the people who matter. Regrettably, I have never been to India, but I have a large number of Indian friends. They are almost without exception smart, skilled, motivated, energetic, and really nice. More than a billion people like that? It’s amazing. India is going to have a great future.
66
Zim/Interview/interview_with_Rich_Stevens,.txt
Normal file
@@ -0,0 +1,66 @@
Content-Type: text/x-zim-wiki
Wiki-Format: zim 0.4
Creation-Date: 2011-06-03T16:06:11+08:00
====== Interview with Rich Stevens ======
Created Friday 03 June 2011
http://www.kohala.com/start/unpv12e/interview.html
Prentice Hall interview with Rich Stevens, author of UNIX Network Programming, Volume 1: Networking APIs: Sockets and XTI, 2/e
October, 1997
Prentice Hall: How did you become involved in Unix networking, from a programming and author perspective?
Rich Stevens: During the 1980s, while I was at Health Systems International, we were doing Unix software development for a variety of platforms. We went through the normal sequence of hardware that most startups went through at that time: one VAX-11/750 running 4.2BSD, then a bigger VAX (785), then multiple VAXes (added an 8650), throw in some PCs running a variety of operating systems (Venix, Xenix, DOS), and for good measure one IBM mainframe running VM.
Naturally, with multiple VAXes running 4.xBSD, you connect them together with an Ethernet and run TCP/IP, and TCP/IP was also available for the PC-based Unices and the mainframe. In addition to the standard utilities (ftp, rlogin), we started writing our own applications using sockets. Documentation was almost nonexistent (I had very worn copies of the two Leffler et al. documents from the 4.3BSD manual set), so when you needed an answer, you looked it up in the source code. After doing this for a while, I realized that everything I was digging up should really be documented.
I really believe that my background is fundamental to the success of UNP and my other books. That is, I was not one of the developers at Berkeley or AT&T, so the writing of UNP was not a "memory dump". Everything that is in the book I had to dig out of somewhere and understand myself. This process of digging up the details and learning how things work leads down many side streets and to many dead ends, but is fundamental (I think) to understanding something new.
Many times in my books I have set out to write how something works, thinking I know how it works, only to write some test programs that lead me to things that I never knew. I try to convey some of these missteps in my books, as I think seeing the wrong solution to a problem (and understanding why it is wrong) is often as informative as seeing the correct solution.
Prentice Hall: How has the Unix network programming environment changed since the publication of the First Edition in 1990?
Rich Stevens: First, it is obvious that TCP/IP is the future, something that was not a given in 1990, with all the OSI hype that was taking place. Second, the Berkeley sockets interface has also become the de facto standard, despite X/Open's big push for TLI/XTI. Third, IP version 6 (IPv6) should be heavily used during the lifetime of the second edition, so there is a strong emphasis in the book on protocol-independent networking code, allowing applications to be developed for both IPv4 and IPv6. Lastly, the Posix.1g standard for both sockets and XTI is near final approval, so it was important for the rewrite to include this standard, along with the forthcoming Unix 98 standard.
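[Editor's note: the protocol-independent style described here is what the getaddrinfo interface, standardized in the same Posix.1g effort, makes possible. The sketch below is mine, not code from the book; the host and port in main are placeholders.]

	#include <stdio.h>
	#include <string.h>
	#include <unistd.h>
	#include <netdb.h>
	#include <sys/socket.h>

	/* Connect to host:service without caring whether the result is
	 * IPv4 or IPv6; getaddrinfo hides the difference. */
	int tcp_connect_any(const char *host, const char *service)
	{
	    struct addrinfo hints, *res, *ai;
	    int fd = -1;

	    memset(&hints, 0, sizeof(hints));
	    hints.ai_family = AF_UNSPEC;      /* IPv4 or IPv6, whichever answers */
	    hints.ai_socktype = SOCK_STREAM;

	    if (getaddrinfo(host, service, &hints, &res) != 0)
	        return -1;
	    for (ai = res; ai != NULL; ai = ai->ai_next) {
	        fd = socket(ai->ai_family, ai->ai_socktype, ai->ai_protocol);
	        if (fd < 0)
	            continue;
	        if (connect(fd, ai->ai_addr, ai->ai_addrlen) == 0)
	            break;                    /* success */
	        close(fd);
	        fd = -1;
	    }
	    freeaddrinfo(res);
	    return fd;                        /* -1 on failure */
	}

	int main(void)
	{
	    int fd = tcp_connect_any("www.example.com", "80");  /* placeholder host */
	    if (fd >= 0) {
	        puts("connected");
	        close(fd);
	    }
	    return 0;
	}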
Prentice Hall: How did the Second Edition evolve to 3 volumes and what will be covered in each volume?
Rich Stevens: When I started the rewrite over a year ago I started with the sockets chapter and things just grew and grew as I expanded this one chapter from the 1990 edition into over 20 chapters. My first attempt to make it "fit" was to take out the second half of the 1990 edition (applications) and make that a separate volume. I then realized that Chapter 3 from the 1990 edition (IPC, or interprocess communication) would not fit either. A hard decision was whether or not to include the six chapters on XTI. I decided to keep them because they are still used by some developers, often with protocols other than TCP/IP, and because XTI is part of the Posix.1g standard.
Volume 2 will be IPC, covering both the information from Chapter 3 of the 1990 edition and the newer Posix.1b realtime IPC methods.
Volume 3 will be applications, some from the 1990 edition plus many new applications that have been developed since 1990.
Prentice Hall: Volume 1 is devoted primarily to sockets programming. How has this area changed since the publication of the First Edition and how is this change reflected in the book?
Rich Stevens: Besides more details on the topics from the 1990 edition, the following topics are new to the second edition: IPv4/IPv6 interoperability, protocol-independent name translation, routing sockets, multicasting, Posix threads, IP options, datalink access, client-server design alternatives (how many different ways are there to write a Web server?), virtual networks and tunneling, and network program debugging techniques.
Although most of these topics are demonstrated using the sockets API under Unix, many of these are what I call "network programming fundamentals" that can be implemented on systems other than Unix, and using an API other than sockets.
Prentice Hall: How would you describe the code in Volume 1 and how will your readers be able to access this code?
Rich Stevens: The code is available to anyone on the Internet and should compile easily on most current Unix systems. The majority of the 10,000 lines of C code are functions that one can use as building blocks (a network programming toolchest) inside their own network applications. Many of these functions help hide the differences between IPv4 and IPv6, and can aid the reader in developing protocol-independent code.
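[Editor's note: the building-block idea can be pictured as thin wrappers that fold the error handling in once, so application code stays short. This is an illustrative pattern of my own, not the book's actual functions.]

	#include <stdio.h>
	#include <stdlib.h>
	#include <sys/socket.h>

	/* Wrapper in the building-block style: check for the error here,
	 * once, so callers need not repeat the test at every call site. */
	int Socket(int family, int type, int protocol)
	{
	    int fd = socket(family, type, protocol);
	    if (fd < 0) {
	        perror("socket");
	        exit(1);
	    }
	    return fd;
	}

	int main(void)
	{
	    int fd = Socket(AF_INET, SOCK_STREAM, 0);  /* no error check needed here */
	    printf("got descriptor %d\n", fd);
	    return 0;
	}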
Prentice Hall: Why is there such a continuing demand for information on Unix when certain persons in the Pacific Northwest would have us believe that NT is taking over the world?
Rich Stevens: As Scott McNealy says, Unix has been "dying" at an annual growth rate of about 20% per year for the past 10 years. Without starting a religious debate, I can state that Unix is a time-tested, industrial-strength system that has enough critical mass to continue for many, many years. Its usage continues to grow in the commercial world (just witness how many of the world's Web servers run on Unix platforms) along with the free versions (e.g., the phenomenal growth of Linux). The absolute number of Unix systems will never equal that of Windows, but who cares? As a writer, I just need to know that the audience for my books is not declining; the total number of Unix programmers is increasing, not declining.
Prentice Hall: Every book you have written has been extremely successful and has enjoyed a long shelf life. Can you describe the process and effort you put into the development of your books?
Rich Stevens: I have a couple of credos when writing a book. I must also admit that two Prentice Hall books that have helped shape these beliefs are Kernighan and Ritchie's C book, and Kernighan and Pike's "The UNIX Programming Environment".
First, the book must be technically accurate. If you are going to show code, include the code directly from its source files and make certain that it compiles and runs correctly. If you are going to show terminal input and output, use the "script" program to produce a verbatim copy, to avoid any typographical errors. Having many competent technical reviewers helps here too, as no author knows everything about a subject.
Second, the book must be typeset nicely. I think this is critical for a programming book, so that the reader can tell what are commands, what the user types in, what are comments, and so on. The appearance of source code is also critical to understanding the code. To guarantee this, I produce camera-ready copy of all my books, a time-consuming step, but worth it in the end, I believe. Books by Don Knuth and Brian Kernighan are a continual source of inspiration in this area.
Third, I try to demonstrate and not dictate. One small program is often worth a thousand words. One of the reviewers for the second edition of UNP complained about my usage of the third person when writing, as in "When we run this program". But I really envision many readers sitting down with the book at a terminal, with all the examples from the book on line, and going through the steps and examples outlined in the book. I use the term "we" to mean the reader and myself.
Fourth, I like pictures to explain something. If I cannot draw something, I don't understand it. This is especially critical in the area of networking: draw the processes involved, draw the networking connections (or IPC) between the processes, show which files are open by which process, and so on. That's where I came up with the term "Illustrated" for my TCP/IP series, to distinguish it from the other TCP/IP books in the market.
UNP also deserves the "Illustrated" adjective, as it contains lots and lots of figures. Whenever I need to understand something new, I have to draw a picture of what's involved, and I then include these pictures in the books.
Fifth, I firmly believe that my readers want all the details, not glossy overviews. That's why the second edition has grown so much from the 1990 edition. I think that most readers of the new edition will read only half of the book, but each reader will read a different half. This also means that it is essential that the book be usable as a reference, which means lots of references to other related sections of the book, and a thorough index.
Sixth, I never include anything I do not understand. I hate hand waving when I read a book (which often indicates that the author doesn't understand something), so I avoid it at all costs. When I hit something that I don't understand, I take a detour and learn something new. This often makes my books late by a few months, but I think accuracy and completeness are essential. Many times I start a section on some topic, allocating (say) three days to write it. Two weeks later I am finishing the section because I got sidetracked on something that I needed to cover, but which I didn't understand completely.
© Prentice-Hall, Inc. A Simon & Schuster Company Upper Saddle River, New Jersey 07458