Search Results

Search found 35 results on 2 pages for 'knuth'.

Page 1/2 | 1 2  | Next Page >

  • Don Knuth and MMIXAL vs. Chuck Moore and Forth -- Algorithms and Ideal Machines -- was there cross-pollination / influence in their ideas / work?

    - by AKE
    Question: To what extent is it known (or believed) that Chuck Moore and Don Knuth influenced each other's thinking on ideal machines, or each other's work on algorithms? I'm interested in citations, interviews, articles, links, or any other sort of evidence. It could also be evidence of the form "A and B here suggest that Moore might have borrowed C and D from Knuth there", or vice versa. (Opinions are of course welcome, but references / links would be better!) Context: Until fairly recently, I was primarily familiar with Knuth's work on algorithms and computing models, mostly through TAOCP but also through his interviews and other writings. However, the more I have been using Forth, the more I am struck by both the power of a stack-based machine model, and the way in which the spareness of the model makes fundamental algorithmic improvements more readily apparent. A lot of what Knuth has done in fundamental analysis of algorithms has, it seems to me, a very similar flavour, and I can easily imagine that in a parallel universe, Knuth might perhaps have chosen Forth as his computing model. That's the software / algorithms / programming side of things. When it comes to "ideal computing machines", Knuth in the 1960s came up with the MIX computer model, and then, collaborating with designers of state-of-the-art RISC chips through the 90s, updated this with the modern MMIX model and its attendant assembly language MMIXAL. Meanwhile, Moore, having been using and refining Forth as a language, but running it on top of whatever processor happened to be in the computer he was programming, began to imagine a world in which the efficiency and value of stack-based programming were reflected in hardware. So he went on in the 80s to develop his own stack-based hardware chips, defining the term MISC (Minimal Instruction Set Computer) along the way, and eventually arriving at Forth chips such as the MuP21. Both are brilliant men with keen insight into the art of programming and algorithms, and both work at the intersection of algorithms, programs, and bare-metal hardware (i.e. hardware without the clutter of operating systems). Which leads me to the headline question: To what extent is it known (or believed) that Chuck Moore and Don Knuth influenced each other's thoughts on ideal machines, or each other's work on algorithms?

    Read the article

  • I'm a CS student, and honestly, I don't understand Knuth's books

    - by Raymond Ho
    I stumbled upon this quote from Bill Gates: "You should definitely send me a resume if you can read the whole thing." He was talking about The Art of Computer Programming books. So I was pretty curious and want to read them all. But honestly, I don't understand it. I'm really not that intellectual, which is probably why I can't understand it, but I am eager to learn. I'm currently reading Volume 1, on fundamental algorithms. Are there any books out there that are friendly to novices/slow people like me, which would help build up my knowledge so that I can read Knuth's books with ease in the future?

    Read the article

  • Applying the Knuth-Plass algorithm (or something better?) to read two books of different lengths and chapter counts in parallel

    - by user147133
    I have a Bible reading plan that covers the whole Bible in 180 days. Most of the time, I read 5 chapters in the Old Testament and 1 or 2 (1.5) chapters in the New Testament each day. The problem is that some chapters are longer than others (for example Psalm 119, which is 7 times longer than an average chapter in the Bible), and the plan I'm following doesn't take that into account. I end up with some days having a lot more to read than others. I thought I could use programming to make myself a better plan. I have a data structure with a list of all chapters in the Bible and their length in number of lines. (I found that the number of lines is the best criterion, but it could have been the number of verses or the number of words as well.) I then started to think about this problem as a line-wrap problem. Think of a chapter as a word, a day as a line, and the whole plan as a paragraph. The "length" of a word (a chapter) is the number of lines in that chapter. I could then generate the best possible reading plan by applying a simplified Knuth-Plass algorithm to find the best breakpoints. This works well if I want to read the Bible from beginning to end. But I want to read a little from the New Testament each day in parallel with the Old Testament. Of course I can run the Knuth-Plass algorithm on the Old Testament first, then on the New Testament, and get two separate plans. But those plans merged do not make an optimal plan. Worst-case days (days with much more reading than average) in the New Testament plan will randomly fall on the same days as the worst-case days in the Old Testament. Since the New Testament has about 180*1.5 chapters, the plan is generally to read one chapter the first day, two the second, one the third, etc., and I would like the plan for the Old Testament to compensate for this alternating length. So I will need a new and better algorithm, or I will have to use the Knuth-Plass algorithm in a way that I've not figured out. I think this could be an interesting and challenging nut for people interested in algorithms, so I wanted to see if any of you have a good solution in mind.
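
    One possible approach (a minimal sketch, not from the original thread): treat the plan as a partition of the chapter sequence into exactly 180 contiguous "lines" and minimize the squared deviation of each day's load from the daily average, which is the least-squares core of Knuth-Plass without the typographic extras. The chapter lengths below are made-up placeholders.

        # Sketch: split `lengths` into exactly `days` contiguous groups,
        # minimizing squared deviation from the ideal average per day
        # (the least-squares idea behind simplified Knuth-Plass).
        def plan(lengths, days):
            n = len(lengths)
            prefix = [0]
            for x in lengths:
                prefix.append(prefix[-1] + x)
            target = prefix[n] / days  # ideal number of lines per day

            INF = float("inf")
            # best[d][i]: minimal badness of covering chapters 0..i-1 in d days
            best = [[INF] * (n + 1) for _ in range(days + 1)]
            split = [[0] * (n + 1) for _ in range(days + 1)]
            best[0][0] = 0.0
            for d in range(1, days + 1):
                for i in range(d, n + 1):
                    for j in range(d - 1, i):
                        cost = best[d - 1][j] + (prefix[i] - prefix[j] - target) ** 2
                        if cost < best[d][i]:
                            best[d][i], split[d][i] = cost, j
            # Walk the split table backwards to recover the day boundaries.
            breaks, i = [], n
            for d in range(days, 0, -1):
                breaks.append(i)
                i = split[d][i]
            return list(reversed(breaks))

        # Hypothetical chapter lengths (in lines), split over 3 "days":
        print(plan([30, 25, 40, 10, 70, 20, 35, 15], days=3))

    For the parallel-reading twist, one rough heuristic (an assumption, not an optimal joint solution) is to compute the New Testament plan first and then give each Old Testament day its own target in the cost term above: the daily average minus that day's New Testament load.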

    Read the article

  • Performance and Optimization Isn’t Evil

    - by Reed
    Donald Knuth is a fairly amazing guy.  I consider him one of the most influential contributors to computer science of all time.  Unfortunately, most of the time I hear his name, I cringe.  This is because it’s typically somebody quoting a small portion of one of his famous statements on optimization: “premature optimization is the root of all evil.” I mention that this is only a portion of the entire quote, and, as such, I feel that Knuth is being quoted out of context.  Optimization is important.  It is a critical part of every software development effort, and should never be ignored.  A developer who ignores optimization is not a professional.  Every developer should understand optimization – know what to optimize, when to optimize it, and how to think about code in a way that is intelligent and productive from day one. I want to start by discussing my own, personal motivation here.  I recently wrote about a performance issue I ran across, and was slammed by multiple comments and emails that effectively boiled down to: “You’re an idiot.  Premature optimization is the root of all evil.  This doesn’t matter.”  It didn’t matter that I discovered this while measuring in a profiler, and that it was a portion of my code base that can take “many hours to complete.”  Even so, multiple people instantly jumped to “it’s premature – it doesn’t matter.” This is a common thread I see.  For example, StackOverflow has many pages of posts with answers that boil down to (mis)quoting Knuth.  In fact, just about any question relating to a performance issue gets this quote thrown at it immediately – whether it deserves it or not.  That being said, I did receive some positive comments and emails as well.  Many people want to understand how to optimize their code, approaches to take, tools and techniques they can use, and any other advice they can discover. First, let’s get back to Knuth – I mentioned before that Knuth is being quoted out of context.  Let’s start by looking at the entire quote from his 1974 paper Structured Programming with go to Statements: “We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil. Yet we should not pass up our opportunities in that critical 3%. A good programmer will not be lulled into complacency by such reasoning, he will be wise to look carefully at the critical code; but only after that code has been identified.” Ironically, if you read Knuth’s original paper, this statement was made in the middle of a discussion of how Knuth himself had changed his approach to optimization.  It was never a statement saying “don’t optimize”, but rather, “optimizing intelligently provides huge advantages.”  His approach had three benefits: “a) it doesn’t take long” … “b) the payoff is real”, c) you can “be less efficient in the other parts of my programs, which therefore are more readable and more easily written and debugged.” Looking at Knuth’s premise here, and reading that section of his paper, leads to a few observations.  Optimization is important: “he will be wise to look carefully at the critical code.”  Normally, about 3% of your code – three lines out of every hundred you write – is “critical code” that will require some optimization: “we should not pass up our opportunities in that critical 3%.”  Optimization, if done well, should not be time consuming: “it doesn’t take long.”  Optimization, if done correctly, provides real benefits: “the payoff is real.”  None of this is new information.  People who care about optimization have been discussing this for years – for example, Rico Mariani’s Designing For Performance (a fantastic article) discusses many of the same issues very intelligently.  That being said, many developers seem unable or unwilling to consider optimization.  Many others don’t seem to know where to start.  As such, I’m going to spend some time writing about optimization – what it is, how we should think about it, and what we can do to improve our own code.

    Read the article

  • How can I use the Fisher-Yates shuffle while ensuring my permutation is even?

    - by Mithrax
    I'm interested in making an implementation of the 14-15 puzzle. I'm creating an array with the values 0 - 15 in increasing order: S = { 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15 } Now, what I want to do is shuffle them to create a new instance of the puzzle. However, I know that if I create a board with an "odd permutation" then it is unsolvable. Wikipedia says I need to create the puzzle with an even permutation. I believe this means that I simply have to ensure I do an even number of swaps? How would I modify Fisher-Yates so I ensure I end up with an even permutation at the end? If I did a swap for every element in the array, that would be 16 swaps, which I believe would be an even permutation. However, do I need to be concerned about an element being swapped with itself? Is there any other way to ensure I have a valid puzzle?
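
    A minimal sketch of one approach (not from the original thread): count only the swaps that actually move something, since swapping an element with itself is the identity and leaves parity unchanged, and if the count ends up odd, apply one extra fixed transposition. Because a fixed transposition maps odd permutations one-to-one onto even ones, conditioning a uniform Fisher-Yates shuffle this way stays uniform over the even permutations.

        import random

        # Fisher-Yates, tracking the parity of the permutation produced.
        def even_permutation_shuffle(items):
            a = list(items)
            swaps = 0
            for i in range(len(a) - 1, 0, -1):
                j = random.randint(0, i)
                if j != i:          # a self-swap is a no-op: parity unchanged
                    a[i], a[j] = a[j], a[i]
                    swaps += 1
            if swaps % 2:           # odd permutation: one more swap makes it even
                a[0], a[1] = a[1], a[0]
            return a

        print(even_permutation_shuffle(range(16)))

    Caveat: for the 15-puzzle proper, solvability depends on the tile permutation's parity combined with the blank's row, so a common simplification is to keep the blank (0) in its home square and shuffle only the other 15 tiles this way.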

    Read the article

  • avconv gets killed if mkv has subtitles

    - by Lukas Knuth
    What I'm trying to do is take a movie (in a Matroska container), convert all audio tracks to AC3 and not touch anything else. I'm using this line:

        avconv -i infile.mkv -map 0 -vcodec copy -scodec copy -acodec ac3 -ab 256k outfile.mkv

    This works fine, except when there are subtitles embedded. Then, after some time processing with no progress, avconv just "dies" (output shortened to what seem to be the interesting parts):

        [matroska,webm @ 0xf867a0] max_analyze_duration reached
        [matroska,webm @ 0xf867a0] Estimating duration from bitrate, this may be inaccurate
        ...
        Incompatible sample format 's16' for codec 'ac3', auto-selecting format 'flt'
        ...
        Stream #0.0(eng): Video: H264 / 0x34363248, yuv420p, 1280x536 [PAR 1:1 DAR 160:67], q=2-31, 1k tbn, 1k tbc (default)
        Stream #0.1(ger): Audio: ac3, 48000 Hz, 5.1, flt, 256 kb/s (default)
        Stream #0.2(eng): Audio: ac3, 48000 Hz, 5.1, flt, 256 kb/s
        Stream #0.3(ger): Subtitle: dvdsub (default) (forced)
          Metadata:
            title : forced
        Stream #0.4(ger): Subtitle: dvdsub
          Metadata:
            title : complete
        Stream mapping:
          Stream #0:0 -> #0:0 (copy)
          Stream #0:1 -> #0:1 (dca -> ac3)
          Stream #0:2 -> #0:2 (dca -> ac3)
          Stream #0:3 -> #0:3 (copy)
          Stream #0:4 -> #0:4 (copy)
        Input stream #0:2 frame changed from rate:48000 fmt:s16 ch:6 to rate:48000 fmt:flt ch:6
        Input stream #0:1 frame changed from rate:48000 fmt:s16 ch:6 to rate:48000 fmt:flt ch:6
        frame= 2606 fps=1303 q=-1.0 size=       3kB time=107.36 bitrate=   0.2kbits/s
        ...
        frame=96141 fps=813 q=-1.0 size= 2195806kB time=2807.04 bitrate=6408.2kbits/s
        frame=96251 fps=810 q=-1.0 size= 2195806kB time=2807.04 bitrate=6408.2kbits/s
        ...
        frame=97015 fps=397 q=-1.0 size= 2195806kB time=2807.04 bitrate=6408.2kbits/s
        Getötet ["Killed", in English]

    I have no idea why this happens, as there is no error output. I'd like to just copy the subtitles over, not touch them at all. If that won't work, they can be completely dropped.
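
    Two untested workarounds that might sidestep the crash (sketches, not confirmed fixes): map only the video and audio streams so avconv never touches the dvdsub tracks, or drop subtitles entirely with -sn, which matches the "they can be completely dropped" fallback.

        # Keep video + audio only; subtitles are excluded by the explicit mapping:
        avconv -i infile.mkv -map 0:v -map 0:a -vcodec copy -acodec ac3 -ab 256k outfile.mkv

        # Or rely on default stream selection and disable subtitle recording:
        avconv -i infile.mkv -sn -vcodec copy -acodec ac3 -ab 256k outfile.mkv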

    Read the article

  • Finding useful crash information in Windows 8 Consumer Preview

    - by Lukas Knuth
    I'm currently diving into C# and wanted to play around with the new Metro-styled applications introduced with Windows 8, so I updated my Windows 7 to Windows 8 Consumer Preview. The problem I'm facing right now is that the system freezes after 3-5 minutes. It does not take any input from the keyboard or mouse, and it does not recover (at least not in less than 10 minutes). Since I have a background in Linux, I'd like to find some information about the cause of the freeze, but I have no idea where to search. I checked the system logs (under "System Control" - "Management") but they only record that the system was shut down unexpectedly (due to the fact that I held down the power button to reboot the PC). There is no useful crash information in there. I don't want to spend hours randomly reinstalling drivers and doing things that "might help". Isn't there any place I can find some useful information about the freeze? Before you ask: I installed Windows 8 as an update to my old Windows 7 installation (which worked fine, by the way). My hardware meets the minimum requirements (specs can be found here, the MacMini 3,1 model with the 2GHz processor). I have updated the graphics-card drivers to the newest Windows 8 drivers from nVidia.

    Read the article

  • The Art of Computer Programming - To read or not to read?

    - by Zannjaminderson
    There are lots of books about programming out there, and it seems Code Complete is pretty much at the top of most people's list of "must-read programming books", but what about The Art of Computer Programming by Donald Knuth? I'm a busy person; between work and a young family I don't have a ton of free time, so I have to be picky about how I use it. I'm wondering - has anybody here read TAOCP? If so, is it worth making time to read, or would some other book, or more on-the-side programming like pet projects or contributing to open source, be a better use of my time in terms of professional development? DISCLAIMER - For those of you who sport "Knuth is my homeboy" t-shirts, don't get me wrong - I want to read it, but I'm just wondering if it should be right at the top of my priority list or if something else should come first.

    Read the article

  • Fastest gap sequence for shell sort?

    - by Tony
    According to Marcin Ciura's research on optimal (best known) sequences of increments for the shell sort algorithm, the best known sequence is 1, 4, 10, 23, 57, 132, 301, 701... But how can I generate such a sequence? In his paper, Ciura said: "Both Knuth’s and Hibbard’s sequences are relatively bad, because they are defined by simple linear recurrences." Yet most algorithm books I've searched tend to use Knuth's sequence, k = 3k + 1, because it's easy to generate. What's your way of generating a shellsort sequence?
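
    For reference, a short illustrative sketch: Knuth's gaps fall out of the recurrence h = 3h + 1, while Ciura's sequence has no known closed form (it was found empirically), so the usual trick is to hard-code his increments and, when more are needed, extend them geometrically with a growth factor of roughly 2.25, a figure commonly attributed to Ciura's own suggestion.

        def knuth_gaps(n):
            """Knuth's increments 1, 4, 13, 40, ... below n, largest first."""
            h, gaps = 1, []
            while h < n:
                gaps.append(h)
                h = 3 * h + 1
            return gaps[::-1]

        def ciura_gaps(n, growth=2.25):
            """Ciura's empirical increments, extended geometrically past 701."""
            gaps = [1, 4, 10, 23, 57, 132, 301, 701]
            while gaps[-1] * growth < n:
                gaps.append(int(gaps[-1] * growth))
            return [g for g in gaps if g < n][::-1]

        print(knuth_gaps(1000))    # [364, 121, 40, 13, 4, 1]
        print(ciura_gaps(10000))   # [7983, 3548, 1577, 701, 301, 132, 57, 23, 10, 4, 1]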

    Read the article

  • Which reference provides your definition of "elegant" or "beautiful" code?

    - by Donnied
    This question is phrased in a very specific way - it asks for references. There was a similar question posted which was closed because it was considered a duplicate of a good-code question. The Programmers FAQ points out that answers should have references - otherwise it's just an unproductive sharing of (seemingly) baseless opinions. There is a difference between the shortest code and the most elegant code. This becomes clear in several seminal texts: Dijkstra, E. W. (1972). The humble programmer. Communications of the ACM, 15(10), 859–866. Kernighan, B. W., & Plauger, P. J. (1974). Programming style: Examples and counterexamples. ACM Comput. Surv., 6(4), 303–319. Knuth, D. E. (1984). Literate programming. The Computer Journal, 27(2), 97–111. doi:10.1093/comjnl/27.2.97 They all note the importance of clarity over brevity. Kernighan & Plauger (1974) provide descriptions of "good" code, but "good code" is certainly not synonymous with "elegant". Knuth (1984) describes the importance of exposition and "excellence of style" to elegant programs. He cites Hoare, who describes that code should be self-documenting. Dijkstra (1972) indicates that beautiful programs optimize efficiency but are not opaque. This sort of conversation is qualitatively different from a random sharing of opinions. Therefore, the question: Which reference provides your definition of "elegant" or "beautiful" code? "Which *reference*" is not subjective - anything else will most likely shut the thread down, so please supply *references*, not opinions.

    Read the article

  • Why do old programming languages continue to be revised?

    - by SunAvatar
    This question is not, "Why do people still use old programming languages?" I understand that quite well. In fact the two programming languages I know best are C and Scheme, both of which date back to the 70s. Recently I was reading about the changes in C99 and C11 versus C89 (which seems to still be the most-used version of C in practice and the version I learned from K&R). Looking around, it seems like every programming language in heavy use gets a new specification at least once per decade or so. Even Fortran is still getting new revisions, despite the fact that most people using it are still using FORTRAN 77. Contrast this with the approach of, say, the typesetting system TeX. In 1989, with the release of TeX 3.0, Donald Knuth declared that TeX was feature-complete and future releases would contain only bug fixes. Even beyond this, he has stated that upon his death, "all remaining bugs will become features" and absolutely no further updates will be made. Others are free to fork TeX and have done so, but the resulting systems are renamed to indicate that they are different from the official TeX. This is not because Knuth thinks TeX is perfect, but because he understands the value of a stable, predictable system that will do the same thing in fifty years that it does now. Why do most programming language designers not follow the same principle? Of course, when a language is relatively new, it makes sense that it will go through a period of rapid change before settling down. And no one can really object to minor changes that don't do much more than codify existing pseudo-standards or correct unintended readings. But when a language still seems to need improvement after ten or twenty years, why not just fork it or start over, rather than try to change what is already in use? If some people really want to do object-oriented programming in Fortran, why not create "Objective Fortran" for that purpose, and leave Fortran itself alone? I suppose one could say that, regardless of future revisions, C89 is already a standard and nothing stops people from continuing to use it. This is sort of true, but connotations do have consequences. GCC will, in pedantic mode, warn about syntax that is either deprecated or has a subtly different meaning in C99, which means C89 programmers can't just totally ignore the new standard. So there must be some benefit in C99 that is sufficient to impose this overhead on everyone who uses the language. This is a real question, not an invitation to argue. Obviously I do have an opinion on this, but at the moment I'm just trying to understand why this isn't just how things are done already. I suppose the question is: What are the (real or perceived) advantages of updating a language standard, as opposed to creating a new language based on the old?

    Read the article

  • What are the best-known arbitrary precision arithmetic implementation approaches?

    - by keykeeper
    I'm going to write a class library for .NET which provides an implementation of arbitrary precision arithmetic for integer, rational and maybe complex numbers. Which well-known approaches should I become familiar with? I tried to start with Knuth's TAOCP Vol. 2 (Seminumerical Algorithms, Chapter 4 – Arithmetic) but it's too complicated. At least I couldn't get the ideas in a relatively short period of time.
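
    The core representation in TAOCP's Chapter 4 is less daunting in code than in prose: a magnitude is a little-endian array of "limbs" in a large base, and the schoolbook algorithms work limb by limb with a carry. A toy sketch (Python only for brevity; in C# the limbs would be uint values with products accumulated in a ulong):

        BASE = 10**9  # each limb stores 9 decimal digits

        def add(a, b):
            """Schoolbook addition (Knuth's Algorithm A), least significant limb first."""
            out, carry = [], 0
            for i in range(max(len(a), len(b))):
                s = carry + (a[i] if i < len(a) else 0) + (b[i] if i < len(b) else 0)
                out.append(s % BASE)
                carry = s // BASE
            if carry:
                out.append(carry)
            return out

        def mul(a, b):
            """Schoolbook O(n*m) multiplication (Knuth's Algorithm M)."""
            out = [0] * (len(a) + len(b))
            for i, x in enumerate(a):
                carry = 0
                for j, y in enumerate(b):
                    t = out[i + j] + x * y + carry
                    out[i + j] = t % BASE
                    carry = t // BASE
                out[i + len(b)] += carry
            while len(out) > 1 and out[-1] == 0:  # strip leading zero limb
                out.pop()
            return out

        # 123456789123456789 + itself, as limb arrays (least significant first):
        print(add([123456789, 123456789], [123456789, 123456789]))

    Signs, rationals and complex numbers then layer on top of the magnitude type; division (Knuth's Algorithm D) is the genuinely tricky part.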

    Read the article

  • How to attend one-off lectures? [closed]

    - by Senthil Kumaran
    Many times, we come across one-off lectures from famous computer scientists. Last year, I came across one by Barbara Liskov, but I could not go because the university hall was a bit far away. Tomorrow, there is one by Dr. Knuth! Now the problem I am facing is that I don't know much about the material he is going to talk about, so I am not sure if I should plan to attend, and I fear it will be like going to a temple. :) What is your advice, and what general strategy have you followed, whenever you wanted to attend a talk or lecture that might have been worthwhile as an introductory tutorial, but was instead an advanced one-hour lecture by a famous scientist?

    Read the article

  • GOTO still considered harmful?

    - by Kyle Cronin
    Everyone is aware of Dijkstra's Letters to the editor: go to statement considered harmful (also here .html transcript and here .pdf) and there has been a formidable push since that time to eschew the goto statement whenever possible. While it's possible to use goto to produce unmaintainable, sprawling code, it nevertheless remains in modern programming languages. Even the advanced continuation control structure in Scheme can be described as a sophisticated goto. What circumstances warrant the use of goto? When is it best avoided? As a follow-up question: C provides a pair of functions, setjmp and longjmp, that provide the ability to goto not just within the current stack frame but within any of the calling frames. Should these be considered as dangerous as goto? More dangerous? Dijkstra himself regretted that title, for which he was not responsible. At the end of EWD1308 (also here .pdf) he wrote: Finally a short story for the record. In 1968, the Communications of the ACM published a text of mine under the title "The goto statement considered harmful", which in later years would be most frequently referenced, regrettably, however, often by authors who had seen no more of it than its title, which became a cornerstone of my fame by becoming a template: we would see all sorts of articles under the title "X considered harmful" for almost any X, including one titled "Dijkstra considered harmful". But what had happened? I had submitted a paper under the title "A case against the goto statement", which, in order to speed up its publication, the editor had changed into a "letter to the Editor", and in the process he had given it a new title of his own invention! The editor was Niklaus Wirth. A well-thought-out classic paper on this topic, to be read alongside Dijkstra's, is Structured Programming with go to Statements (also here .pdf), by Donald E. Knuth. Reading both helps to re-establish context and a non-dogmatic understanding of the subject. In this paper, Dijkstra's own opinion on the matter is reported, and it is even stronger: Donald E. Knuth: I believe that by presenting such a view I am not in fact disagreeing sharply with Dijkstra's ideas, since he recently wrote the following: "Please don't fall into the trap of believing that I am terribly dogmatical about [the go to statement]. I have the uncomfortable feeling that others are making a religion out of it, as if the conceptual problems of programming could be solved by a single trick, by a simple form of coding discipline!"

    Read the article

  • Algorithm for Text Wrapping Within a Shape

    - by devongovett
    I am looking for an algorithm to wrap text within a non-rectangular shape, preferably based on the Knuth and Plass algorithm. The hardest part of this is that the lines may have different heights due to differing font sizes in the text. The image below is an example of what the algorithm should be able to generate. Thanks for any help!
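
    A simplified sketch of one common adaptation (not a full Knuth-Plass implementation): make the available width a function of the line index, i.e. of vertical position in the shape, and run a least-squares breaking DP against it. Uniform line height is assumed here; with mixed font sizes the DP state would also need the running y-offset, since a line's height then depends on its contents. All names and widths below are hypothetical.

        from functools import lru_cache

        def wrap_in_shape(word_widths, space, width_of_line, max_lines=100):
            """Return break positions minimizing total squared slack per line."""
            n = len(word_widths)
            INF = float("inf")

            @lru_cache(maxsize=None)
            def best(i, line):
                if i == n:
                    return (0.0, ())
                if line >= max_lines:
                    return (INF, ())
                cost, breaks, w = INF, (), 0.0
                for j in range(i, n):
                    w += word_widths[j] + (space if j > i else 0)
                    slack = width_of_line(line) - w
                    if slack < 0:        # words i..j no longer fit on this line
                        break
                    tail_cost, tail = best(j + 1, line + 1)
                    c = (0.0 if j + 1 == n else slack ** 2) + tail_cost
                    if c < cost:
                        cost, breaks = c, (j + 1,) + tail
                return (cost, breaks)

            return best(0, 0)[1]

        # A crude "circle": lines are narrow at the top and bottom.
        widths = lambda k: [60, 90, 100, 90, 60][min(k, 4)]
        print(wrap_in_shape([20, 15, 30, 25, 10, 35, 20, 15],
                            space=5, width_of_line=widths))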

    Read the article

  • Nuggets of wisdom?

    - by Bill Karwin
    There are many quotes from famous computer scientists that have become the wisdom that guides our profession. For example: "Premature optimization is the root of all evil in programming." - Donald Knuth (citing Hoare's Dictum) "Everyone knows that debugging is twice as hard as writing a program in the first place. So if you're as clever as you can be when you write it, how will you ever debug it?" - Brian Kernighan And so on. My question is: what are your favorite words of wisdom about programming from someone who is not famous? Was it a friend, a coworker, a teacher, or a family member? For example, a technical writer friend of mine said: "You can't get the right answers unless you ask the right questions." Thanks for all the contributions! The answer I selected was (a) specifically coding-related, and (b) stated by someone who is not technically famous (though he has a popular blog and a podcast and runs StackOverflow). I.e. he's no Bill Gates or Yogi Berra.

    Read the article

  • Favorite Programmer Quotes…

    - by SGWellens
      "A computer once beat me at chess, but it was no match for me at kick boxing." — Emo Philips   "There are only 10 types of people in the world, those who understand binary and those who don't. " – Unknown.   "Premature optimization is the root of all evil." — Donald Knuth   "I should have become a doctor; then I could bury my mistakes." — Unknown   "Code softly and carry a large backup thumb drive." — Me   "Always code as if the guy who ends up maintaining your code will be a violent psychopath who knows where you live." — Martin Golding   "DDE…the protocol from hell"— Charles Petzold   "Just because a thing is new don't mean that it's better" — Will Rogers   "The mark of a mature programmer is willingness to throw out code you spent time on when you realize it's pointless." — Bram Cohen   "A good programmer is someone who looks both ways before crossing a one-way street." — Doug Linder   "The early bird may get the worm but it's the second mouse that gets the cheese." — Unknown   I hope someone finds this amusing. Steve Wellens CodeProject

    Read the article

  • Lots of great stuff going on with Oracle Secure Global Desktop!

    - by Chris Kawalek
    You're probably familiar with Oracle Secure Global Desktop, our solution for providing secure, browser-based access to Oracle Applications and other enterprise software. It's a fantastic product and one I've been personally involved with for nearly a decade! I wanted to give you a quick update on all the fantastic things that are going on with it: First, we have done a few videos with Oracle's Mohan Prabhala at trade shows recently. You can get a quick product refresher and an update on the latest new features by watching these: Next, we talked at length with Brian Madden and Gabe Knuth on Brian and Gabe LIVE about Oracle Secure Global Desktop. Click here or on the screenshot below to go to the brianmadden.com video. Part 1 focuses on Oracle Secure Global Desktop. Listen toward the end for Brian to say, “I kinda want this actually at TechTarget right now.” The analysts are talking about us, too. When we released Oracle Secure Global Desktop 4.7, Chris Wolf over at Gartner had this to say on Twitter. Last, just a quick reminder for existing Oracle Applications customers that Oracle Secure Global Desktop is easy for you to leverage for secure application access. Oracle Secure Global Desktop is certified for use with Oracle browser-based applications such as Primavera and E-Business Suite, and with Exalogic. Steven Chan over at the E-Business Suite Technology blog gives a great explanation of how Oracle Secure Global Desktop works with E-Business Suite, as an example. As the title says, lots of great stuff going on! -Chris

    Read the article

  • I want to build a Virtual Machine – are there any good references?

    - by Michael Stum
    I'm looking to build a Virtual Machine as a platform-independent way to run some game code (essentially scripting). The Virtual Machines that I'm aware of in games are rather old: Infocom's Z-Machine, LucasArts' SCUMM, id Software's Quake 3. As a .NET developer, I'm familiar with the CLR and looked into the CIL instructions to get an overview of what you actually implement at the VM level (vs. the language level). I've also dabbled a bit in 6502 assembler during the last year. The thing is, now that I want¹ to implement one, I need to dig a bit deeper. I know that there are stack-based and register-based VMs, but I don't really know which one is better at what, and whether there are more, or hybrid, approaches. I need to deal with memory management, decide which low-level types are part of the VM, and understand why stuff like ldstr works the way it does. My only reference book (apart from the Z-Machine stuff) is the CLI Annotated Standard, but I wonder if there is better, more general/fundamental reading on VMs. Basically something like the Dragon Book, but for VMs? I'm aware of Donald Knuth's Art of Computer Programming, which uses a register-based VM, but I'm not sure how applicable that series still is, especially since it's still unfinished. Clarification: The goal is to build a specialized VM. For example, Infocom's Z-Machine contains OpCodes for setting the background color or playing a sound. So I need to figure out how much goes into the VM as OpCodes vs. the compiler that takes a script (language TBD) and generates the bytecode from it, but for that I need to understand what I'm really doing. ¹ I know, modern technology would allow me to just interpret a high-level scripting language on the fly. But where is the fun in that? :) It's also a bit hard to google, because "virtual machines" is nowadays often associated with VMware-type OS virtualization...
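
    To make the stack-vs-register distinction concrete, here is a toy stack machine (a sketch with invented opcodes; Python only for brevity). The CLR's evaluation stack has this flavor; a register design, as in Lua 5's VM, instead encodes operand locations in each instruction, trading bigger instructions for fewer dispatches.

        PUSH, ADD, MUL, PRINT, HALT = range(5)

        def run(code):
            """Dispatch loop of a minimal stack-based VM."""
            stack, pc = [], 0
            while True:
                op = code[pc]; pc += 1
                if op == PUSH:            # literal operand follows inline
                    stack.append(code[pc]); pc += 1
                elif op == ADD:
                    b, a = stack.pop(), stack.pop()
                    stack.append(a + b)
                elif op == MUL:
                    b, a = stack.pop(), stack.pop()
                    stack.append(a * b)
                elif op == PRINT:
                    print(stack.pop())
                elif op == HALT:
                    return

        # (2 + 3) * 4 -- operand locations are implicit, so the encoding stays tiny:
        run([PUSH, 2, PUSH, 3, ADD, PUSH, 4, MUL, PRINT, HALT])   # prints 20

    Game-specific services (set the background color, play a sound) would then become either extra opcodes, as in the Z-Machine, or entries in a host-function table that the bytecode can call into.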

    Read the article
