Re: Re: about Coolness, Impatience and Complexity

by no_slogan (Deacon)
on Jul 13, 2001 at 00:06 UTC


in reply to Re: about Coolness, Impatience and Complexity
in thread about Coolness, Impatience and Complexity

Good point. Turing machines can do any polynomial-time computation in polynomial time, but they may need a higher-order polynomial amount of time than some other type of machine. I was trying to compose a snappy reply, and got sloppy.

Still, when we talk about "polynomial time", what we really mean is "polynomial in the length of the problem using a reasonable encoding scheme." Not "polynomial in the number of entities in the input set," though we sometimes ignore the distinction. If the point and the line in your geometrical computation are specified with 512 bit numbers, that's a larger problem than if they were only 16 bits. You can't reduce that - it's information theory.
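
To put a number on that, here's a rough Perl sketch (using the core Math::BigInt and Benchmark modules; the particular operand sizes are just for illustration) that times squaring a 16-bit value against squaring a 512-bit value. It's the same "one multiplication" either way, but the larger operands simply carry more bits to push around:

    use strict;
    use warnings;
    use Math::BigInt;
    use Benchmark qw(timethese);

    # A 16-bit and a 512-bit operand: the same "two numbers", very different input sizes.
    my $small = Math::BigInt->new(2)->bpow(16)->bsub(1);    # 2**16 - 1
    my $large = Math::BigInt->new(2)->bpow(512)->bsub(1);   # 2**512 - 1

    # Run each squaring for a couple of CPU seconds and compare the rates.
    timethese(-2, {
        '16-bit'  => sub { $small->copy->bmul($small) },
        '512-bit' => sub { $large->copy->bmul($large) },
    });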

Any physical machine you build is going to have a finite number of bits of processing power available to it. We can't make computers out of infinitesimally thin Euclidean line segments. A quantum computer might be able to consider a virtually unlimited number of possibilities at the same time, but it has a limited number of qubits and they have to collapse down into one answer before it can be printed out on the screen.

If we start talking about machines that are impossible to build, are we still doing computer science? I don't say this to be facetious, but just to ask what computer science really is.


Re: about Coolness, Impatience and Complexity
by Abigail (Deacon) on Jul 13, 2001 at 13:59 UTC
    Still, when we talk about "polynomial time", what we really mean is "polynomial in the length of the problem using a reasonable encoding scheme." Not "polynomial in the number of entities in the input set," though we sometimes ignore the distinction. If the point and the line in your geometrical computation are specified with 512 bit numbers, that's a larger problem than if they were only 16 bits. You can't reduce that - it's information theory.
    No, what we mean is "polynomial in the size of the input". That could be bits if your computational model asks for that. If your computational model is geometry, then your input might consist of points and lines. Not bits. Bits don't play a part in geometry. They might play a part in an implementation of the model, but not in the computational model itself. Yes, this requires a level of abstraction, but that's why we call it "science". ;-)
    Any physical machine you build is going to have a finite number of bits of processing power available to it. We can't make computers out of infinitesimally thin Euclidean line segments. A quantum computer might be able to consider a virtually unlimited number of possibilities at the same time, but it has a limited number of qubits and they have to collapse down into one answer before it can be printed out on the screen.
    All true, but not really relevant. If we have an O(f(N)) geometric algorithm (I graduated in the closely related fields of data structures and computational geometry, hence bringing up geometry over and over again) and we want to implement it on a real machine, we might have to multiply or divide by a factor of log N (or something else) somewhere to get a running time expressed in the number of bits of the input. [1] But we'll have the same conversion for another algorithm. Hence, the relation between running time (and space) complexities in one computational model stays the same in the other, assuming you use the same implementation. Of course, in a typical implementation numbers are limited to 32, 64, 128 or some other number of bits, which is a constant disappearing in the big-Oh anyway.
    If we start talking about machines that are impossible to build, are we still doing computer science? I don't say this to be facetious, but just to ask what computer science really is.
    Well, it was you who brought up Turing Machines. ;-) Now, have you ever seen a Turing Machine? Not only are they impossible to build, but even if they did exist, they would be quite impractical to do real computation on. But much of the theory of Computer Science is founded on Turing Machines. Take away Turing Machines, and Computer Science is no longer the Computer Science we all know.

    Do not forget the words of one of the most famous Computer Scientists of all time, Edsger Dijkstra: computer science is as much about computers as astronomy is about telescopes. Computer Science is a big misnomer; the French word is much better: Informatique (I hope I spelled that right), which is described by the French academy of science as the science of the processing, possibly by automata, of information, where information is taken to be knowledge and data. I'm sure it sounds a lot better in French. ;-) There are a few European departments that use the term "Computing Science" when translating their department's name into English. I prefer Computing Science over Computer Science as well.

    [1] I'm using a lot of handwaving here. An algorithm on N numbers each consisting of M bits (and hence having size N*M) doesn't need to have the same running time when it's run on an input of M numbers each consisting of N bits (which also has size N*M).
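
    For instance, with schoolbook multiplication: multiplying N pairs of M-bit numbers costs on the order of N*M**2 bit operations, while multiplying M pairs of N-bit numbers costs on the order of M*N**2. The two inputs have the same total size, but the running times differ.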

    -- Abigail

      Now, have you ever seen a Turing Machine? Not only are they impossible to build, but even if they did exist, they would be quite impractical to do real computation on.
      Bosh. The only reason we can't make a perfect Turing machine is that we can't actually give it an infinite length of tape. We have a well developed theory that tells us what computations can and can't be done by a Turing machine with a finite length of tape. In many cases, it's easier to talk about one with an infinite length, so we do that instead, but at least we know what we're getting ourselves into.
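
      Just to show how concrete the finite-tape case is, here's a toy Perl sketch (the rule table and the binary-increment example are purely illustrative): a finite array for the tape, a tiny transition table, and a loop. It increments 1011 (eleven) to 1100 (twelve).

          use strict;
          use warnings;

          # state => { symbol => [ symbol_to_write, head_move, next_state ] }
          # '_' stands for a blank cell just past the left end of the tape.
          my %rules = (
              carry => {
                  '1' => [ '0', -1, 'carry' ],   # keep propagating the carry
                  '0' => [ '1',  0, 'halt'  ],   # absorb the carry and stop
                  '_' => [ '1',  0, 'halt'  ],   # carried past the top bit: grow the tape
              },
          );

          my @tape  = split //, '1011';   # eleven, in binary
          my $head  = $#tape;             # start at the least significant bit
          my $state = 'carry';

          while ($state ne 'halt') {
              my $sym = $head < 0 ? '_' : $tape[$head];
              my ($write, $move, $next) = @{ $rules{$state}{$sym} };
              if ($head < 0) { unshift @tape, $write; $head = 0 }
              else           { $tape[$head] = $write }
              $head  += $move;
              $state  = $next;
          }
          print join('', @tape), "\n";   # prints 1100

      Bound the growth of @tape at some fixed size and you have exactly the finite-tape machine that the theory describes.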

      Sorry, but I don't agree with Dijkstra. It's true that astronomy is not about telescopes. Astronomy is about stars and nebulae and planets and things, and telescopes are just the tools that let us study them. But computer science is about computing, and things that compute are computers, by definition. So computer science is all about computers - not about PCs and Macs and Sparc workstations in particular, but about computers in general.

      And what is computing? Pushing information around. The flow of information is governed by information theory, which tells us that information is very nearly a physical thing. We usually deal with information in the form of bits, but it doesn't have to be stored that way. A given quantity of information is going to take up a finite minimum amount of time and space, no matter what you do with it. Getting as close as possible to the minimum is an engineering problem that we'll be working on forever. But it's important to keep in mind that we're dealing with stuff that's not infinitely compressible. Information is solid.

      Of course, in a typical implementation numbers are limited to 32, 64, 128 or some other number of bits, which is a constant disappearing in the big-Oh anyway.
      In many cases, you can make the number of bits disappear into the big O, because the numbers don't need to be all that large. However, the reason this conversation started is that I was talking about implementing arbitrary precision arithmetic, which is not one of those cases. If I'm planning on using a RAID array to hold the bits of a number, I'd better not count on the number of bits disappearing into a constant factor. Maybe I want to multiply two 10-terabyte numbers together for some reason. For you to claim that you have a geometrical computing model that ignores the limits of information theory and can do it with the same amount of effort as 2*2 is, quite frankly, silly.
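
      Rough numbers, just to pin that down: a 10-terabyte number is about 8*10**13 bits. Schoolbook multiplication of two such numbers takes on the order of (8*10**13)**2, or roughly 6*10**27, bit operations; even an FFT-style method that scales like n log n still needs a few times 10**15. Either way, the bit length is about as far from a constant that vanishes into the big-Oh as you can get.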

      I don't think there's as much distinction as you think between "computer science" and "computing science". Before, I said, "why talk about computers that can't possibly be built?" Now, I say, "why talk about computations that can't possibly be performed?"

        We have a well developed theory that tells us what computations can and can't be done by a Turing machine with a finite length of tape.
        Please stick to the subject. The subject isn't computability, a domain that Turing Machines frequent, but the run time complexity of algorithms, which you claimed was measured by how fast they can be done on Turing Machines - but then you later claimed that any device that couldn't be built was irrelevant. But yeah, we could build a machine with a finite tape. Now, how many of them do you have lying around?

        But computer science is about computing, and things that compute are computers, by definition.
        Not everything that is about computing needs to be done by a "thing"! Are you dismissing the majority of physics as well because the models they use don't exist in the real world? It's Computer SCIENCE, not electronics.
        And what is computing? Pushing information around. The flow of information is governed by information theory, which tells us that information is very nearly a physical thing. We usually deal with information in the form of bits, but it doesn't have to be stored that way. A given quantity of information is going to take up a finite minimum amount of time and space, no matter what you do with it. Getting as close as possible to the minimum is an engineering problem that we'll be working on forever. But it's important to keep in mind that we're dealing with stuff that's not infinitely compressible. Information is solid.
        So what? Pushing bits around is indeed an engineering problem, just like getting measurements of stars, nebulae and planets is an engineering problem, and has nothing to do with astronomy. Computer Science isn't about how to move bits around - Computer Science is about where to move "the bits" to. And note that I'm using quotes here, because Computer Science doesn't necessarily deal with physical bits. What your basic units are depends entirely on your computational model.

        I don't think there's as much distinction as you think between "computer science" and "computing science".
        I'm sorry, but did I say there was any difference? I don't think so. All I said was that some people think it's a better description of the same thing.

        Before, I said, "why talk about computers that can't possibly be built?" Now, I say, "why talk about computations that can't possibly be performed?"
        Sir, you have never been a scientist, you are not a scientist, and you never will be a scientist. You just don't get it. If people had always had that attitude, we would still be living in the stone age.

        -- Abigail

        What is CS about?

        I agree with Abigail: CS is not about computers. CS is about how we work together to organize ideas and concepts to be able to specify the operation of something incredibly complex. It is not about the incidental details of what is actually going to happen. It is about the principles and details that humans need to understand to get it to happen.

        And so from the point of view of the programmer it is fundamentally irrelevant whether the underlying computation is being carried out by a Turing machine, a RISC processor, an x86 chip, or a Lisp machine. For a human to get things to happen it is essential that the human stop thinking at a low level and start creating useful abstractions. And understanding and manipulating those abstractions means forgetting about the operations that are being performed and thinking instead about how humans comprehend and direct the process.

        You don't believe me? Well, what is Dijkstra best known for? Go To Considered Harmful. Read it. It is a famous paper. When the ACM decided to put some famous papers that had appeared in their magazines online, it was the second one they chose. It started a famous debate, and even people who have not read and understood the paper know that goto is something you are supposed to avoid.

        You haven't read it yet? Please do. Or else what I am about to say will make no sense.

        OK, you just read one of the most influential papers ever written in CS. And not just influential in the abstract. It completely revolutionized the practice of programming. If there is an essence to the study of CS, that paper of all papers should reflect it.

        What did it have to do with computers?
