Search Results

Search found 908 results on 37 pages for 'the worst shady'.

Page 2/37 | < Previous Page | 1 2 3 4 5 6 7 8 9 10 11 12  | Next Page >

  • What are the worst examples of moral failure in the history of software engineering?

    - by Amanda S
    Many computer science curricula include a class or at least a lecture on disasters caused by software bugs, such as the Therac-25 incidents or Ariane 5 Flight 501. Indeed, Wikipedia has a list of software bugs with serious consequences, and a question on StackOverflow addresses some of them too. We study the failures of the past so that we don't repeat them, and I believe that rather than ignoring them or excusing them, it's important to look at these failures squarely and remind ourselves exactly how the mistakes made by people in our profession cost real money and real lives. By studying failures caused by uncaught bugs and bad process, we learn lessons about rigorous testing and accountability, and we make sure that our innocent mistakes are caught before they cause major problems. There are less innocent kinds of failure in software engineering, however, and I think it's just as important to study the serious consequences caused by programmers motivated by malice, greed, or just plain amorality. That way we can learn about the ethical questions that arise in our profession, and how to respond when we are faced with them ourselves. Unfortunately, it's much harder to find lists of these failures -- the only one I can come up with is that apocryphal "DOS ain't done 'til Lotus won't run" story. What are the worst examples of moral failure in the history of software engineering?

    Read the article

  • Worst SysAdmin Accident

    - by Ward
    In line with the question about Best sysadmin accident, what's the worst accident you've been involved in? Unlike the previous question, I mean "worst" in the sense of most system damage or actual harm to people. I'll start with mine: We have two remote wiring closets that are at the end of a 100-foot corridor which has a metal grate for the floor. After we had Cat6 cable installed, the contractors cleaned up all the debris that dropped through the grating to the concrete 3 feet below. A co-worker and I entered the corridor to check on the progress one day but were distracted and didn't notice that a piece of grating had been moved aside. My buddy stepped into air and his chest slammed into the steel crossbar. He was winded and sore enough to take a couple days off, but luckily the steel beam had rounded edges and the size of the opening was such that he didn't smack his head into it or the floor below. Obviously we learned that areas where the floor is partially removed need to be flagged.

    Read the article

  • The best programmer is N times more effective than the worst? Who Cares?

    - by StevenWilkins
    There is a latent belief in programming that the best programmer is N times more effective than the worst, where N is usually between 10 and 100. Here are some examples: http://www.devtopics.com/programmer-productivity-the-tenfinity-factor/ http://www.joelonsoftware.com/articles/HighNotes.html http://haacked.com/archive/2007/06/25/understanding-productivity-differences-between-developers.aspx There is some debate as to whether or not it's been proven: http://morendil.github.com/folklore.html I'm confident in the accuracy of these statements: the best salesmen in the world are probably 10-100 times better than the worst; the best drivers in the world are probably 10-100 times better than the worst; the best soccer players in the world are probably 10-100 times better than the worst; the best CEOs in the world are probably 10-100 times better than the worst. In some cases, I'm sure the difference is greater. In fact, you could probably say that the best [insert any skilled profession here] in the world are probably 10-100 times better than the worst. We don't know what N is for the rest of these professions, so why concern ourselves with what the actual number is for programming? Can we not just say that the number is large enough that it's very important to hire the best people, and move on already?

    Read the article

  • What do you consider your "worst" hack?

    - by magcius
    What is the worst hack you've ever written? This is different from What is the worst code you've ever written?, because that, as I understand it, revolves around code later called the worst out of ignorance. Hack: code written knowing it is horrible code, for the sake of convenience, deadlines, working around another broken system or bug, etc. -- but not out of ignorance. If you want, you can describe your co-workers' reaction, how bad your hospital bill was after showing them the code, and whether you felt disappointed in yourself for writing it or proud of yourself for coming up with a creative and clever solution. This doesn't have to be shipped code; it could also be code written for personal purposes.

    Read the article

  • Worse is better. Is there an example?

    - by J.F. Sebastian
    Is there a widely used algorithm that has time complexity worse than that of another known algorithm but is a better choice in all practical situations (worse complexity but better otherwise)? An acceptable answer might take the form: there are algorithms A and B with O(N**2) and O(N) time complexity respectively, but B has such a big constant that it has no advantage over A for inputs smaller than the number of atoms in the universe. Highlights from the answers: the simplex algorithm -- worst-case exponential time -- vs. known polynomial-time algorithms for convex optimization problems; naive median finding -- worst-case O(N**2) -- vs. the known O(N) median-of-medians algorithm; backtracking regex engines -- worst-case exponential -- vs. O(N) Thompson NFA-based engines. All these examples exploit the gap between worst-case and average behavior. Are there examples that do not rely on the difference between the worst case and the average case? Related: The Rise of "Worse is Better". (For the purpose of this question, "Worse is Better" is used in a narrower sense -- algorithmic time complexity -- than in the article.) Python's Design Philosophy: The ABC group strived for perfection. For example, they used tree-based data structure algorithms that were proven to be optimal for asymptotically large collections (but were not so great for small collections). This example would be the answer if there were no computers capable of storing such large collections (in other words, "large" is not large enough in this case). The Coppersmith–Winograd algorithm for square matrix multiplication is a good example: it had the best asymptotic complexity as of 2008, yet it is inferior in practice to asymptotically worse algorithms. Any others? From the Wikipedia article: "It is not used in practice because it only provides an advantage for matrices so large that they cannot be processed by modern hardware (Robinson 2005)."

    Read the article
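
    One of the excerpt's examples is easy to reproduce: Python's re module is a backtracking engine, so a pattern with nested quantifiers hits the exponential worst case even though such engines are perfectly fast on typical inputs. A minimal sketch (the pattern and input sizes are my own, not from the question):

        # Catastrophic backtracking: (a+)+c can never match "aaa...b", so the
        # engine tries every way of splitting the run of a's between the two
        # quantifiers before giving up.
        import re
        import time

        pattern = re.compile(r"(a+)+c")

        for n in (16, 18, 20, 22):
            text = "a" * n + "b"  # no match possible, forcing full backtracking
            start = time.perf_counter()
            pattern.match(text)
            print(n, round(time.perf_counter() - start, 3), "seconds")

        # Runtime roughly doubles with each extra 'a'; an automaton-based engine
        # such as RE2 handles the same pattern in linear time.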

  • What is the Worst Depiction of Computer Use in a Movie

    - by Robert Cartaino
    You know the type: "It's a Unix system. I know this" -- in Jurassic Park, where a computer-genius girl sees a computer and quickly takes over, flying through the file system like a 3-D video game to shut down the park. [video link to the scene] So what's your favorite movie gaffe that shows Hollywood can be completely clueless when it comes to portraying technology?

    Read the article

  • Worst Web Site Design Ever

    - by Alex Angas
    I'm looking for a very good example of a very poorly designed web site. For example: use of <blink> mixed with many 'cute' animated GIFs (a common home page in the mid-'90s). It needs to display relatively correctly in the popular web browsers of today. Thank you!

    Read the article

  • Web Security: Worst-Case Situation

    - by Yongho
    I currently have built a system that checks user IP, browser, and a random-string cookie to determine if he is an admin. In the worst case, someone steals my cookie, uses the same browser I do, and masks his IP to appear as mine. Is there another layer of security I should add onto my script to make it more secure?

    Read the article
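
    For concreteness, here is a sketch of the check the question describes (the Flask-style request object, cookie name, and stored values are my assumptions, not details from the post):

        # Admin check based on IP address, browser string, and a random cookie.
        import hmac

        ADMIN_IP = "203.0.113.7"
        ADMIN_UA = "Mozilla/5.0 ..."
        ADMIN_TOKEN = "random-string-issued-at-login"

        def is_admin(request):
            cookie = request.cookies.get("admin_token", "")
            return (
                request.remote_addr == ADMIN_IP
                and request.headers.get("User-Agent") == ADMIN_UA
                and hmac.compare_digest(cookie, ADMIN_TOKEN)  # constant-time compare
            )

        # In the worst case the question describes, all three signals are forged
        # by the attacker, so the check passes for the wrong person.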

  • What is the worst code you've ever written?

    - by Even Mien
    Step into the confessional. Now's your time to come clean. What's the worst code you personally have ever written? Why was it so bad? What did you learn from it? Don't tell us about code you inherited or from some co-worker. This is about your personal growth as a programmer and as a person.

    Read the article

  • What is the worst C#/.NET gotcha?

    - by MusiGenesis
    This question is similar to this one, but focused on C# and .NET. I was recently working with a DateTime object, and wrote something like this: DateTime dt = DateTime.Now; dt.AddDays(1); return dt; // still today's date! WTF? The IntelliSense documentation for AddDays says it adds a day to the date, which it doesn't -- it actually returns a new date with a day added to it, so you have to write it like: DateTime dt = DateTime.Now; dt = dt.AddDays(1); return dt; // tomorrow's date This one has bitten me a number of times before, so I thought it would be useful to catalog the worst C# gotchas.

    Read the article
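
    The corrected C# is already in the excerpt; the same class of pitfall -- an immutable value whose methods return a new instance instead of mutating the receiver -- exists elsewhere, for example in Python's datetime (shown here only as an analogous sketch, not as the excerpt's code):

        from datetime import datetime, timedelta

        dt = datetime.now()
        dt.replace(year=2000)        # return value discarded -- dt is unchanged
        dt + timedelta(days=1)       # likewise builds a new datetime, dt unchanged

        dt = dt + timedelta(days=1)  # correct: rebind the name to the new value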

  • Search Complexity of a Hashtable within a Hashtable?

    - by spacker_lechuck
    Say we have a hashtable of size m, and at each bucket we store a hashtable of size p. What would the worst case/average case search complexity be? I am inclined to say that since computing a hash function is still atomic, the only worst case scenario is if the value is at the end of the linked list in the hashtable of size p, so O(n)? I have no idea how to calculate the average case for this scenario and would appreciate any pointers!

    Read the article
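
    A quick sketch of the structure being analyzed, using Python dicts as a stand-in for the two hashtables (the linked-list bucket details from the question are not modeled here):

        # Two-level lookup: hash key1 into the outer table, then hash key2 into
        # the inner table stored there. Two O(1)-expected probes on average; in
        # the worst case every key collides at both levels and a lookup scans
        # all n stored entries, i.e. O(n).
        outer = {}

        def put(key1, key2, value):
            outer.setdefault(key1, {})[key2] = value

        def get(key1, key2):
            return outer[key1][key2]

        put("users", "alice", 42)
        print(get("users", "alice"))  # 42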

  • Worst security hole you've seen?

    - by Si
    Subject says it all, probably a good idea to keep details basic to protect the guilty. FWIW, here's a question about what to do if you find a security hole, and another with some useful answers if a company doesn't (seem to) respond.

    Read the article

  • Worst technobabble you've ever heard

    - by pookleblinky
    Following the Egregious pop culture perversion of programming, what is the most outlandishly insane technobabble you have ever heard, either in fiction or real life? Extra points to those unfortunates whose real life stories beat Hollywood. Note: feel free to sketch out what would be necessary for such gibberish to actually work.

    Read the article

  • What is the worst class/variable/function name you have ever encountered

    - by Chris Noe
    Naming things well is arguably Job 1 for professional programmers. Yet we have all suffered from some bad naming choices from time to time. So just to vent a little, what are some doozies that you may have run across? Just to get things started: One of our original developers wasn't sure what to call a secondary key - on what turned out to be a primary table for this app - so he called it: DL2WhateverTheHellThatIs. Unfortunately this system generates entity mappings from the XML, and attributes defined there result in classes, methods, and constants that are referenced throughout the app. To this day it is very hard to find a source file that does not reference this, er, thing! A few actual examples: DL2WhateverTheHellThatIsBean cos = (DL2WhateverTheHellThatIsBean)itr.next(); String code = getDL2WhateverTheHellThatIs().getCode(); From from = new From("DL2WhateverTheHellThatIs"); String filter = "_dL2WhateverTheHellThatIs._code"; (Very difficult to refactor)

    Read the article

  • What is the worst gotcha in WPF?

    - by David
    Hi, I've started to make myself a list of "WPF gotchas": things that bug me and that I had to write down to remember because I fall for them every time. Now, I'm pretty sure you have all stumbled upon similar situations at one point, and I would like you to share your experience on the subject: What is the gotcha that gets you all the time? The one you find the most annoying? (I have a few issues that seem to be without explanation; maybe your submissions will explain them.) Here are a few of my "personal" gotchas (in no particular order): For a MouseEvent to be fired even when the click is on the "transparent" background of a control (e.g. a label) and not just on the content (the Text in this case), the control's Background has to be set to Brushes.Transparent and not just null (the default value for a label). A WPF DataGridCell's DataContext is the RowView to which the cell belongs, not the CellView. When inside a ScrollViewer, a ScrollBar is managed by the ScrollViewer itself (i.e. setting properties such as ScrollBar.Value has no effect). Key.F10 is not fired when you press F10; instead you get Key.System and you have to look at e.SystemKey to find Key.F10. ... and now you're on.

    Read the article

  • Worst aspect of Python for a newbie

    - by schickb
    I'm wondering specifically what experienced programmers thought when they started developing in Python. I'm sure the answer depends on your background, but my own personal answer is the conversion of basically anything in the language to a True/False value in boolean contexts. Resulting in "oddities" like: if x: not meaning the same thing as: if x == True: I understand why, but it bugs me, and I certainly had to think about it a bit when I first ran into it.

    Read the article
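
    A small illustration of the distinction the excerpt describes (truthiness is not the same as equality with True):

        x = 2
        if x:           # taken: any non-zero number is truthy
            print("truthy")
        if x == True:   # not taken: 2 == True compares against the integer 1
            print("equal to True")

        # Several common falsy values in boolean contexts:
        for value in (0, 0.0, "", [], {}, None):
            assert not value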

  • Rank Source Control Options-VSS vs CVS vs none vs your own hell

    - by Roman A. Taycher
    It seems like a lot of people here, and on many programmer wikis/blogs/etc. elsewhere, really dislike VSS. A lot of people also have a serious dislike for CVS. In many places I have heard differing opinions on whether using VSS or CVS is better or worse than using no source control at all. Please rate the worst and explain why you rated them that way. Feel free to throw your own horrible system into the rankings. If you feel it depends on the circumstances, try to explain some of the different scenarios that lead to different rankings. (Note: I see a lot of discussion of what is better but little of what is worse.) Second note: while both answers are nice, I'm looking less for good replacements and more for a comparison of which is worse and, more importantly, why!

    Read the article

  • "UML is the worst thing to ever happen to MDD." Why?

    - by Florents
    William Cook wrote in a tweet: "UML is the worst thing to ever happen to MDD. Fortunately many people now realize this ..." I would like to know the reasoning behind that claim (which, apparently, is not just his personal opinion). I've noticed that many people out there don't like UML that much. It is also worth mentioning that he is in academia, where UML is pretty much the holy grail of effective design and modelling.

    Read the article

  • What is the worst software bug in history? [closed]

    - by Amir Rezaei
    Taking, for example, money and human suffering as the metric, what is the worst software bug in history? Note that this is a specific question. Last month automaker Toyota announced a recall of 160,000 of its Prius hybrid vehicles following reports of vehicle warning lights illuminating for no reason and cars' gasoline engines stalling unexpectedly. But unlike the large-scale auto recalls of years past, the root of the Prius issue wasn't a hardware problem -- it was a programming error in the smart car's embedded code. The Prius had a software bug.

    Read the article
