Search Results

Search found 17336 results on 694 pages for 'richard long'.


  • Thank you South Florida for a successful SPSouthFLA

    - by Leonard Mwangi
    I wanted to officially thank the organizers, speakers, volunteers and attendees of SharePoint Saturday South Florida. For the first event in South Florida, the reception was phenomenal: the lineup ran from Joel Oleson's keynote to sessions by well-renowned speakers like John Holliday, Randy Drisgill, Richard Harbridge, Ameet Phadnis, Fabian Williams, Chris McNulty and Jaime Velez, supported by organizers like Michael Hinckley, amongst others. With my Business Intelligence (BI) presentation scheduled in the last track of the day, I spent quality time networking with these great guys and getting the insider scoop on the international SharePoint community from Joel and his son, which was mesmerizing. I had a very active audience, to the point where we couldn't cover all the content within the allocated hour: they were engaged and wanted a deep-dive session on new features like PowerPivot and the enhancements to PerformancePoint and Excel Services, in order to understand the business value and how SharePoint 2010 is making self-service BI a reality. These community events let attendees experience technology first hand and network with MVPs, authors and experts, providing high-quality educational sessions, usually for free, which is reason enough to attend. I have made the slides for my session available for download for those interested: http://goo.gl/VaH5x

    Read the article

  • Design pattern for client/server sessions?

    - by nonot1
    Are there any common patterns or general guidance I can learn from for how to design a client/server system where both the client and server must maintain some kind of per-client session state? I've found any number of libraries that can help with some of the plumbing, but it's the overall design I'm wondering about. Open issues in my mind:

    - How to structure the client/server communication so that both bidirectional synchronous and asynchronous requests are possible?
    - The server side needs to spawn a couple of per-connected-client, session-long helper processes. How to manage that?
    - How to manage the mapping from a given client (and any of its requests) to server state and helper process instances, in the face of multiple clients and intermittent network connectivity?
    - Most communication can be simple blocking request/reply, but some will be long-running processing tasks that the client will want to keep tabs on.

    To the extent that it matters, the platform is Linux/C/C++. Not web based; just an existing thick-client software app being modified to talk to backend servers for some tasks.
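
    For illustration, a minimal sketch of the per-client bookkeeping this describes, in C++ for the stated Linux platform. All names are hypothetical, and it assumes one session-long helper per client, spawned with plain fork/exec:

        #include <map>
        #include <memory>
        #include <mutex>
        #include <string>
        #include <sys/types.h>
        #include <unistd.h>

        // One per connected client: session state plus its helper process.
        struct ClientSession {
            std::string clientId;   // stable ID agreed at connect time
            pid_t helperPid = -1;   // session-long helper process
            // ... pending async requests, progress of long-running tasks
        };

        // Maps client IDs to sessions. It deliberately survives disconnects,
        // so a reconnecting client is matched back to its state and helper.
        class SessionRegistry {
        public:
            std::shared_ptr<ClientSession> getOrCreate( const std::string& clientId ) {
                std::lock_guard<std::mutex> lock( mutex_ );
                auto& slot = sessions_[clientId];
                if ( !slot ) {
                    slot = std::make_shared<ClientSession>();
                    slot->clientId = clientId;
                    slot->helperPid = spawnHelper( clientId );
                }
                return slot;
            }

        private:
            static pid_t spawnHelper( const std::string& clientId ) {
                pid_t pid = fork();
                if ( pid == 0 ) {
                    // child: exec the per-client helper (hypothetical binary)
                    execl( "/usr/local/bin/session-helper", "session-helper",
                           clientId.c_str(), static_cast<char*>( nullptr ) );
                    _exit( 1 );  // only reached if exec fails
                }
                return pid;
            }

            std::mutex mutex_;
            std::map<std::string, std::shared_ptr<ClientSession>> sessions_;
        };

    Keyed this way, blocking request/reply and long-running tasks both resolve to the same session object, which is the mapping the question is really about; reconnects simply call getOrCreate again with the same client ID.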

    Read the article

  • Slow to sync music files?

    - by pst007x
    I have created a folder in the Ubuntu One sync folder and called it Music. I have added various albums and the folder has started to sync. However, the files were added back in October, and still only the folders have synced, no music files. I tested this service before and added a single music file directly into the Ubuntu One folder (no sub-folders), and within a few days it synced; however, it seems anything in sub-folders stalls or takes a very long time. My Ubuntu One client always says syncing and the progress bar creeps along, but still no files have synced. I know there are issues with speed, but a month and still going? I recently tested the same files with Dropbox and it took 9 hours. I have forwarded the HTTPS port (443) both in the software firewall and in my router; I tried disabling both firewalls too; either way it makes no difference. I have also tried both from home and at the office on different Ubuntu systems. Is there anything anyone has done to improve this service? I am trying to integrate the Ubuntu One service into the office to share project files, but the syncing is taking too long. I am using the latest Ubuntu 10.10 (fully updated, fresh install). I love Ubuntu and wish to continue to support it any way I can, so a solution would be good :-) Any help would be appreciated. Thanks, Paul

    Read the article

  • Updating physics for animated models

    - by Mathias Hölzl
    For a new game we have to set up a scene with a minimum of 30 bone-animated models (shooter). The problem is that the update process for the animated models takes too long. This is what I do: each character has ~30 bones, and on every update tick the animation gets calculated and every bone fires an event with the new matrix. The physics engine receives the event with the new matrix and updates the collision shape for that bone. The time that it takes to build the animation isn't that bad (0.2ms for 30 bones - 6ms for 30 models). But the main problem is that the physics engine (Bullet) uses a different matrix type for transforms, so it's necessary to convert it. Code for the matrix conversion (~0.005ms):

        btTransform CLEAR_PHYSICS_API Mat_to_btTransform( Mat mat )
        {
            btMatrix3x3 bulletRotation;
            btVector3 bulletPosition;
            XMFLOAT4X4 matData = mat.GetStorage();

            // copy rotation matrix (transposed)
            for ( int row = 0; row < 3; ++row )
                for ( int column = 0; column < 3; ++column )
                    bulletRotation[row][column] = matData.m[column][row];

            // copy position
            for ( int column = 0; column < 3; ++column )
                bulletPosition[column] = matData.m[3][column];

            return btTransform( bulletRotation, bulletPosition );
        }

    The function for updating the transform (physics):

        void CLEAR_PHYSICS_API BulletPhysics::VKinematicMove( Mat mat, ActorId aid )
        {
            if ( btRigidBody * const body = FindActorBody( aid ) )
            {
                btTransform tmp = Mat_to_btTransform( mat );
                body->setWorldTransform( tmp );
            }
        }

    The real problem is the lookup inside FindActorBody(id):

        ActorIDToBulletActorMap::const_iterator found = m_actorBodies.find( id );
        if ( found != m_actorBodies.end() )
            return found->second;

    All physics actors are stored in m_actorBodies, and that's why the updating process takes too long. But I have no idea how I could avoid this. Friendly greetings, Mathias
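
    One hedged way around this, assuming the set of bones and bodies is fixed once a model is loaded: resolve each bone's btRigidBody* once, at load time, and store the pointer next to the bone that fires the event, so the per-tick path never touches the ActorId map. A minimal sketch (the binding struct and function are hypothetical, not part of Bullet):

        #include <btBulletDynamicsCommon.h>

        // Cached per bone at model-load time, so the per-tick update
        // skips the ActorId -> btRigidBody map lookup entirely.
        struct BonePhysicsBinding {
            btRigidBody* body = nullptr;  // resolved once via FindActorBody()
        };

        // Per-tick path: no map lookup, just set the converted transform.
        inline void KinematicMoveCached( const btTransform& t, BonePhysicsBinding& binding )
        {
            if ( binding.body )
                binding.body->setWorldTransform( t );
        }

    The same idea works on the event side: have the bone event carry the binding (or the body pointer itself) instead of the ActorId, so the physics layer never has to search for anything per tick.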

    Read the article

  • Monitoring remote laptops

    - by kaerast
    We're looking for something to monitor around 30 remote laptops that are constantly out on the road, never returning to base except when there are serious hardware faults that need repairing. These laptops won't always be connected to the internet; they'll have mobile broadband and may work offline most of the time. They will be running a mixture of Windows XP, Vista and 7, and there is currently no server setup. We're primarily interested in making sure that Windows Updates and antivirus updates are happening, and I guess we should also be monitoring remaining disk space, what software is installed, and ideally hardware health. It might also be nice if we could gain remote access to perform work on them. My main reason for wanting to monitor them is that it's going to be a real pain to get them back to base if anything goes wrong, so I want to be proactive in ensuring they last as long as possible. Can you recommend what I should be monitoring to ensure a long life? What tools would you use to monitor and maintain these computers?

    Read the article

  • Why is Maven so slow compared to automake?

    - by ???'Lenik
    I have a Maven project consisting of around 100 modules. I have reasons to decompose the project into so many modules, and I don't think I should merge them just to speed up the build process. I have looked at a lot of projects by other people, e.g. the Maven project itself, Apache Archiva, and the Hudson project; they all consist of a lot of modules, nearly 100, more or less. The problem is that building them all takes so much time: 3 hours for the first build (this is acceptable, because there are a lot of artifacts to download), and 15 minutes for every build after that (this is not acceptable). For automake, things are similar: the first time, you need to configure the project to prepare the magical config.h file, which is far more complex than what Maven does, but it's still fast, maybe 10 seconds on my Debian box. After that, make install takes maybe 10 minutes for the first build. However, once everything is prepared and the .o object files are generated, they don't have to be rebuilt at all for the second build. (In Maven, everything is rebuilt every time.) I really wonder how people working on Maven projects can bear this long a time for each build. I just can't sit calmly through each Maven build; it takes too long, really.
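
    For what it's worth, a few standard Maven flags can shrink that second build considerably; a hedged sketch, with the module name as a placeholder (these flags exist in Maven 3, but check your version):

        # rebuild only the changed module plus the modules that depend on it
        mvn -pl my-changed-module -amd install

        # go offline once all artifacts are in the local repository
        mvn -o install

        # build independent modules in parallel (Maven 3)
        mvn -T 4 install

    This still rebuilds more than make would, since Maven has no .o-style per-file dependency tracking across runs, but it avoids walking all ~100 modules for a one-module change.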

    Read the article

  • Problems when trying to connect to a router wirelessly

    - by Ruud Lenders
    The situation - At my girlfriend's parents' place there are six Windows 7 devices connected to a router, wired or wireless: 3 desktops and 3 laptops. There are also several smartphones using the router. The router is secured with WPA2 (AES).

    The problem - We never had any problems with the router for over a year. But recently - about 3 weeks ago - my girlfriend's laptop (HP) and my laptop (ASUS) started to develop problems when trying to connect to the router. The router has stopped showing up in the network list. Sometimes it comes back and shows up, but then it keeps saying something along the lines of "Could not connect", and not long after that it disappears again. The range of the router is not the problem here, because we experience the same when we sit next to it. Sometimes, if we are lucky and have waited a long time (10-15 minutes) without using the laptop for anything, the laptop will eventually connect to the router successfully.

    The attempts - Of course, the Windows 7 troubleshooter. We tried troubleshooting the connection problems and the wireless network adapter, but no luck. We also reset the router enough times to know that's not helping either. Here's the full list of things we tried that did not help:

    - Running the Windows 7 troubleshooter
    - Resetting the router (more than once)
    - Setting the router settings to factory defaults
    - Disconnecting all other devices except one laptop
    - Applying a system restore
    - Trying static/dynamic IP/DNS - dynamic is better, right?
    - Enabling/disabling IPv6 - should I keep IPv6 disabled?
    - Running the command: netsh wlan stop hostednetwork
    - Running the command: netsh wlan set hostednetwork mode=disallow
    - Updating/reinstalling the wireless adapter drivers

    The tests - To help find the core of the problem, we tested the following:

    - Plugging an ethernet cable into the router and into our laptops - worked fine
    - Connecting someone else's laptop to the router (wireless) - worked fine
    - Connecting our laptops to someone else's router - worked fine

    The router - This information might be relevant:

    - Router model: Sitecom 300N Wireless Router
    - Router hardware: version 01
    - The DHCP server's IPs range from 192.168.0.100 to 192.168.0.200.
    - Router settings: wireless channel 12; channel bandwidth 20/40 MHz; extension channel 8; preamble type long; 802.11g protection disabled; UPnP enabled

    The laptops - If you are wondering about our laptops:

    - My laptop model: ASUS Pro64JQ
    - Girlfriend's laptop: HP Pavilion G6
    - OS: both Windows 7 Professional x64, with Service Pack 1
    - My wireless adapter: Atheros AR9285
    - AdHoc 11n: enabled

    The question - Has anyone experienced the same problems I am having? Or does someone know how to solve this? Are there more tricks I can try, or settings I should change?

    Note - Our laptops are not slow or old. My laptop is 1.5 years old, and the other laptop is just 5 months old. I know how to keep laptops clean, and I'm pretty sure both laptops are not bloated with useless software.

    Read the article

  • What lasts longer: Data stored on non-volatile flash RAM, optical media, or magnetic disk?

    - by Chris W. Rea
    What lasts longer: data stored on non-volatile flash RAM (USB sticks or SD cards?), optical media (CD, DVD, or Blu-ray?), or magnetic disk (floppies, hard drives?)? My gut tells me optical media, but I'm not sure. Furthermore, which of these digital media would be most suitable for long-term data storage where environmental conditions are unknown, such as low/high temperature or humidity? For example, what digital media could be stored in a basement, attic, or time capsule and be expected to survive a reasonably long time? E.g. a lifetime, and then some. Update: Looks like optical media and magnetic tape each have one vote below. Does anybody else have an opinion or know of a study comparing the two?

    Read the article

  • How can I be sure that the motherboard is dead and it's not another issue?

    - by Peter
    So I have an old computer with an Intel D101GGC motherboard, a Pentium 4 CPU and an Award BIOS; it gives a long repeating beep and does not pass POST. I tried:

    - Checking the RAM: the modules are good and work in other computers.
    - Replacing the PSU with a tested one, and I still have the same issue.
    - I assumed the CPU was dead and tested 2 others, both Celeron D processors.
    - I also tested this computer's CPU in another machine, and it works fine.
    - I also tried booting without RAM and still got the same long beep.

    I did all this after disconnecting all the other hardware, like the HDD and DVD drives. The only time I didn't get any beeps was after removing the CPU; I don't know if the processor is needed for the beeps to work. So my questions are: is it normal to not get any beeps without a CPU installed? Am I missing something, or is the motherboard dead? Thanks in advance.

    Read the article

  • Speed up loading of test results from builds in Visual Studio

    - by Jakob Ehn
    I still see people complaining about the long time it takes to load test results from a TFS build in Visual Studio. And they make a valid point: it does take a very long time to load the test results, even for a small number of tests. The reason for this is that the test results are not just the result of the test run but also all the binaries that were part of the test run. This often also means that the debug symbols (*.pdb) will be downloaded to your local machine. The reason for this behaviour is that it lets you re-run the tests locally. However, most of the time this is not what the developer will do; they just want to know which tests failed and why. They can then fix the tests and rerun them locally. It turns out there is a way to load only the test results, which is much faster. The only tricky bit is to find the location of the .trx file that is generated during the build, particularly in TFS 2010, where you often have multiple build agents, which of course results in different paths to the trx file. Note: to use this you must have read permission to the build folder on the build agent where the build was executed.

    1. Open the build result for the build.
    2. Click View Log.
    3. Locate the part where MSTest is invoked. Note: you can actually search in the log window; press Ctrl+F and you will get a little search box at the bottom. Nice!
    4. On the MSTest command line, locate the /resultsfileroot parameter, which points to the folder where the test results are stored.
    5. Note that this path is local to the build server, so you need to replace the drive letter with the server name: D:\Builds\Project\TestResults becomes \\<BuildServer>\Project\TestResults.
    6. Double-click on the .trx file and you will notice that it loads much faster compared to opening it from the build log window.

    Read the article

  • IT Optimization Plan Pays Off For UK Retailer

    - by [email protected]
    I caught this article in ComputerworldUK yesterday. The headline says that UK-based supermarket chain Morrisons is increasing its IT spend... OK, sounds good. Even nicer that Oracle is a big part of that. But what caught my eye were three things: 1) Morrisons truly has a long-term strategy for IT; in this case, modernizing and optimizing how they use IT for business advantage. 2) Even in a tough economic climate, Morrisons views IT investments as contributing to and improving the bottom line. Specifically, "The investment in IT contributed to a 21 percent increase in Morrison's underlying profit." 3) The phased, 3-year "Optimization Plan" took a holistic approach to their business, from CRM and Supply Chain systems to the underlying application infrastructure. On the infrastructure front, adopting a more flexible Service-Oriented Architecture enabled them to be more agile and adapt their business, and Identity Management helped with sometimes mundane (but costly) issues like lost passwords and being able to document who has access to what. Things don't always turn out so rosy, and I know it was a long and difficult process... but it's nice to see a happy ending every once in a while.

    Read the article

  • J2EE or .Net Framework [closed]

    - by Kevino
    I want to learn Java or C#. Tell me the strengths and weaknesses of each platform, J2EE and the .NET Framework, today in 2012, and which is safer for future jobs. I tend to prefer Java because here (Montreal, Toronto) there are about 6 Java jobs for every C# job, and some experienced programmers advised me to go with Java because they say JVM languages are winning in the cloud, and the rise of Android can't do anything except help Java in the long run. Is that true today, with the release of Windows 8 coming soon, and with iOS devices? On the other side, one of these programmers told me that corporations love ASP.NET MVC 3 for intranet and web development, and that Tomcat/Apache Java JSP adoption is slowing down compared to ASP.NET, Ruby on Rails, HTML5, etc. He also told me that since I have a good background in system administration and networking, C# would be better for me, because I'd be able to do more things in the Microsoft world with PowerShell automation and by creating my own apps for all the networking stuff (Windows Server, DNS, DHCP, Active Directory, SharePoint, etc.). But what if Windows 8 flops? Aren't Java and Android safer in the long run? He told me Mono was a joke compared to Java/Android or native Objective-C on iOS devices. (I plan to study either Java or C# full time, 10-15 hours a day, for the next 9 months; that's why I ask.)

    Read the article

  • Keyboard freezes / stuck if a key pressed repeatedly

    - by Aziz Rahmad
    I use an Acer 4530. This problem has been happening since Ubuntu 10.10, and continues now that I use 11.04 dual-booted with Linux Mint 10. Every time I press one key repeatedly, like when I read a long article on a website or in an ebook, or when I play games that require me to press arrow keys repeatedly, it randomly freezes. That is, whatever I press on the keyboard has no effect, and the same happens with the touchpad. However, the USB mouse works just fine. I later found out that it doesn't actually freeze; it's more like the key gets stuck. For example, when I play Tetris, where I usually press the w (down) key repeatedly, after some time it freezes, and if I put the cursor in, say, the browser's address bar, it types "wwww....." indefinitely. The only way I can fix it is by suspending the laptop, either by using the mouse or by closing the lid. In that case, instead of staying suspended, the laptop automatically wakes up and everything is fine again. (Usually my laptop wakes up from suspend when any key is pressed.) It has happened since I first used Ubuntu 10.10, it also happens in Linux Mint 10, and it still happens now in Ubuntu 11.04. It never happened when I used Windows, though. Has anyone encountered a similar problem? Does anyone know how to fix it permanently? UPDATE: I recently installed Windows 7 and the Windows 8 Developer Preview, and both show similar symptoms, so this problem is not OS-specific; it's probably a hardware problem.

    Read the article

  • How insecure is my short password, really?

    - by rika-uehara
    Using systems like TrueCrypt, when I have to define a new password I am often informed that using a short password is insecure and "very easy" to break by brute force. I always use passwords 8 characters in length, not based on dictionary words, consisting of characters from the set A-Z, a-z, 0-9, i.e. a password like sDvE98f1. How easy is it to crack such a password by brute force, i.e. how fast? I know it heavily depends on the hardware, but maybe someone could give me an estimate of how long it would take on, say, a 2 GHz dual core, just to have a frame of reference. To brute-force such a password one needs not only to cycle through all the combinations but also to attempt decryption with each guessed password, which also takes time. Also, is there software to brute-force TrueCrypt? I want to try to crack my own password to see how long it takes, if it really is that "very easy".
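
    As a back-of-the-envelope check, the keyspace for 8 characters drawn from the 62-symbol set A-Z, a-z, 0-9 works out as follows; the guess rate below is an assumed round number for an older dual core, and real rates vary enormously with how costly each decryption attempt is:

        62^8 = 218,340,105,584,896 ≈ 2.2 × 10^14 combinations

        at an assumed 10^6 guesses/second:
        2.2 × 10^14 / 10^6 ≈ 2.2 × 10^8 seconds ≈ 7 years worst case (~3.5 years on average)

    GPU-based crackers can be orders of magnitude faster against fast hashes, so the feasibility question comes back to how slow each guess is against the target; TrueCrypt's key derivation is deliberately iterated to make each guess expensive.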

    Read the article

  • Get to Know a Candidate (19-25 of 25): Independent Candidates

    - by Brian Lanham
    DISCLAIMER: This is not a post about "Romney" or "Obama". This is not a post about whom I am voting for. Information sourced from Wikipedia. The following independent candidates have gained access to at least one state ballot.

    - Richard Duncan, of Ohio; Vice-presidential nominee: Ricky Johnson. Ballot access: Ohio (18 electoral). Write-in access: Alaska, Florida, Indiana, Maryland.
    - Randall Terry, of West Virginia; Vice-presidential nominee: Missy Smith. Ballot access: Kentucky, Nebraska, West Virginia (18 electoral). Write-in access: Colorado, Indiana.
    - Sheila Tittle, of Texas; Vice-presidential nominee: Matthew Turner. Ballot access: Colorado, Louisiana (17 electoral).
    - Jeff Boss, of New Jersey; Vice-presidential nominee: Bob Pasternak. Ballot access: New Jersey (14 electoral).
    - Dean Morstad, of Minnesota; Vice-presidential nominee: Josh Franke-Hyland. Ballot access: Minnesota (10 electoral). Write-in access: Utah.
    - Jill Reed, of Wyoming; Vice-presidential nominee: Tom Cary. Ballot access: Colorado (9 electoral). Write-in access: Indiana, Florida.
    - Jerry Litzel, of Iowa; Vice-presidential nominee: Jim Litzel. Ballot access: Iowa (6 electoral).

    That wraps it up, people. We have reviewed 25 presidential candidates in the 2012 U.S. election. Look for more blog posts about the election to come.

    Read the article

  • What expectation should I have of South African web development rates / duration? [closed]

    - by Warren van Rooyen
    I am a developer, but only intend to do the front-end work for getting Reddit-like upvote/downvote functionality going on an upcoming site I'm building. I have never had to contract a developer for back-end work to implement code for me, so I am quite in the dark on how much I should expect to pay and how long it could take to get the site going. I could be taken for a ride: the developer could inflate the time it would take at a seemingly regular rate (hourly/daily), or could otherwise inflate the rate. Please could you help me with this? I know you need some guidance on the nature of the site, so here it is. I have a Reddit-type template with CSS and PHP included. I then downloaded Pligg code that's intended to provide the Reddit upvote/downvote functionality. How long would a developer roughly need to unite the theme and front end with the back-end functionality? I do understand it's not a lot of info, but I'm sure you're experienced enough to have an instinct for the size of the project. Also, should I agree on an hourly rate, a day rate, or a per-project payment?

    Read the article

  • JavaOne 2012 Conference Preview

    - by Janice J. Heiss
    A new article, by noted freelancer Steve Meloan, now up on otn/java, titled “JavaOne 2012 Conference Preview,” looks ahead to the fast-approaching JavaOne 2012 Conference, scheduled for September 30-October 4 in San Francisco. The Conference will celebrate and highlight one of the world’s leading technologies. As Meloan states, “With 9 million Java developers worldwide, 5 billion Java cards in use, 3 billion mobile phones running Java, 1 billion Java downloads each year, and 100 percent of Blu-ray disk players and 97 percent of enterprise desktops running Java, Java is a technology that literally permeates our world.”

    The 2012 JavaOne is organized under seven technical tracks:

    * Core Java Platform
    * Development Tools and Techniques
    * Emerging Languages on the JVM
    * Enterprise Service Architectures and the Cloud
    * Java EE Web Profile and Platform Technologies
    * Java ME, Java Card, Embedded, and Devices
    * JavaFX and Rich User Experiences

    Conference keynotes will lay out the Java roadmap. For the Sunday keynote, such Oracle luminaries as Cameron Purdy, Vice President of Development; Nandini Ramani, Vice President of Engineering, Java Client and Mobile Platforms; Richard Bair, Chief Architect, Client Java Platform; and Mark Reinhold, Chief Architect, Java Platform, will be presenting.

    For the Thursday IBM keynote, Jason McGee, Distinguished Engineer and Chief Architect for IBM PureApplication System, and John Duimovich, Java CTO and IBM Distinguished Engineer, will explore Java and IBM's cloud-based initiatives.

    All in all, the JavaOne 2012 Conference should be as exciting as ever. Link to the article here. Originally published on blogs.oracle.com/javaone.

    Read the article

  • 50% off ASP.NET hosting

    - by Fabrice Marguerie
    I haven't blogged for a long time because I'm busy working on an exciting new project. It's too early to tell you more; I'll provide details in a few months. Meanwhile, I wanted to write a quick post to share an excellent offer with you. It's that time of the year when you can get deals on many things, including web hosting. I'd like to remind you about Arvixe, a great web hosting provider for Windows (for ASP.NET) and Linux. For 48 hours, between Thursday November 24th at 00:00 PST (08:00 GMT/UTC) and Friday November 25th, Arvixe will be offering 50% off all of their shared hosting products. This will be for all new accounts, for life (as long as you continue to renew the account)!

    I've been using Arvixe for my websites for more than a year and a half now, and I highly recommend them. Here is an overview of what I get for a very good price:

    * Unlimited disk space
    * Unlimited data transfer
    * Unlimited domains
    * Unlimited POP3 and IMAP mailboxes
    * Unlimited SQL Server 2008 databases
    * Unlimited MySQL databases
    * .NET 1.1, 2, 3.5 and 4
    * Dedicated application pools
    * Full trust
    * IIS 7
    * Daily backups
    * and more...

    And now, you can get all that for half the price. Just go to Arvixe.com and secure your own hosting account now by using the coupon code "Black Friday" during checkout.

    Disclaimer: the links to Arvixe are affiliate links that may bring me some money home if you sign up. Still, I recommend Arvixe because I use them and I'm very happy with what they offer.

    Read the article

  • Why do computers get slower over time? [closed]

    - by Paperflyer
    Possible Duplicate: Why does hardware get slower with time? You probably know this: a newly bought computer is snappy and responsive and just really fast. Then you use it for a couple of months, and slowly but steadily the computer gets slower. Opening programs now takes a long time, accessing files takes longer, everything just takes longer than it used to. If you wipe your hard drive and reinstall, everything is back to its original snappiness, but it will deteriorate again. This has always happened with every operating system I've used - worst of all Windows XP, but also with Ubuntu Linux, Fedora Linux, OS X 10.5/10.6, Windows Vista... (I haven't used Windows 7 long enough to confirm this.) Do you know the reason for this? Or even a cure?

    Read the article

  • Penne alla MVP

    - by Valter Minute
    I’m sorry for the long silence on this blog and the long delay in replying to the friends that commented on my articles. I’ve been quite busy in the last weeks and I spent a lot of time traveling around Italy (not for pleasure!). In the meantime I’ve been renewed as an MVP on April the 1st (nice date to renew someone with such a bad sense of humor…). I decided to celebrate my MVP award with a new recipe (to be honest, I celebrated by eating the results of this recipe!) and I decided to call it “penne alla MVP”… just because I’m not good in finding nice names for my recipes. Ingredients (for 4 people): 360g pasta (penne or other short pasta) 300g small shrimps 1 cup of whipped cream 2 tablespoons of olive oil 1 small leek 1 glass of beer (I used Hoegaarden dutch white beer… but just because I like it and I finished the rest of the bootle while cooking) Chives Salt, pepper Prepare the pasta by boiling it in salted water, as usual. In the meantime chop the leek in very small bits, heat the oil inside a pan and when the oil is hot, drop the leek chops and let them cook for a few minutes. Add the shrimps and the glass of beer. Let them cook inside beer until they are cooked (if you used pre-cooked shrimps a couple of minutes would be enough to heat them and gave them the flavour of beer). Add the whipped cream and mix it well with the shrimps and the sauce. Dry the pasta and drop the sauce on top of it and then add the chives finely chopped.

    Read the article

  • Software development is (mostly) a trade, and what to do about it

    - by Jeff
    (This is another cross-post from my personal blog. I don’t even remember when I first started to write it, but I feel like my opinion is well enough baked to share.) I've been sitting on this for a long time, particularly as my opinion has changed dramatically over the last few years. That I've encountered more crappy code than maintainable, quality code in my career as a software developer only reinforces what I'm about to say. Software development is just a trade for most, and not a huge academic endeavor. For those of you with computer science degrees readying your pitchforks and collecting your algorithm interview questions, let me explain. This is not an assault on your way of life, and if you've been around, you know I'm right about the quality problem. You also know the HR problem is very real, or we wouldn't be paying top dollar for mediocre developers and importing people from all over the world to fill the jobs we can't fill. I'm going to try and outline what I see as some of the problems, and hopefully offer my views on how to address them.

    The recruiting problem

    I think a lot of companies are doing it wrong. Over the years, I've had two kinds of interview experiences. The first, and right, kind of experience involves talking about real life achievements, followed by some variation on white boarding in pseudo-code, drafting some basic system architecture, or even sitting down at a comprooder and pecking out some basic code to tackle a real problem. I can honestly say that I've had a job offer for every interview like this, save for one, because the task was to debug something and they didn't like me asking where to look ("everyone else in the company died in a plane crash").

    The other interview experience, the wrong one, involves the classic torture test designed to make the candidate feel stupid and do things they never have, and never will do in their job. First they will question you about obscure academic material you've never seen, or don't care to remember. Then they'll ask you to white board some ridiculous algorithm involving prime numbers or some kind of string manipulation no one would ever do. In fact, if you had to do something like this, you'd Google for a solution instead of wasting time on a solved problem. Some will tell you that the academic gauntlet interview is useful to see how people respond to pressure, how they engage in complex logic, etc. That might be true, unless of course you have someone who brushed up on the solutions to the silly puzzles, and they're playing you.

    But here's the real reason why the second experience is wrong: you're evaluating for things that aren't the job. These might have been useful tactics when you had to hire people to write machine language or C++, but in a world dominated by managed code in C# or Java, people aren't managing memory or trying to be smarter than the compilers. They're using well known design patterns and techniques to deliver software. More to the point, these puzzle gauntlets don't evaluate things that really matter. They don't get into code design, issues of loose coupling and testability, knowledge of the basics around HTTP, or anything else that relates to building supportable and maintainable software.

    The first situation, involving real life problems, gives you an immediate idea of how the candidate will work out. One of my favorite experiences as an interviewee was with a guy who literally brought his work from that day and asked me how to deal with his problem. I had to demonstrate how I would design a class, make sure the unit testing coverage was solid, etc. I worked at that company for two years. So stop looking for algorithm puzzle crunchers, because a guy who can crush a Fibonacci sequence might also be a guy who writes a class with 5,000 lines of untestable code. Fashion your interview process on ways to reveal a developer who can write supportable and maintainable code. I would even go so far as to let them use the Google. If they want to cut-and-paste code, pass on them, but if they're looking for context or straight class references, hire them, because they're going to be life-long learners.

    The contractor problem

    I doubt anyone has ever worked in a place where contractors weren't used. The use of contractors seems like an obvious way to control costs. You can hire someone for just as long as you need them and then let them go. You can even give them the work that no one else wants to do. In practice, most places I've worked have retained and budgeted for the contractor year-round, meaning that the $90+ per hour they're paying (of which half goes to the person) would have been better spent on a full-time person with a $100k salary and benefits. But it's not even the cost that is an issue. It's the quality of work delivered. The accountability of a contractor is totally transient. They only need to deliver for as long as you keep them around, and chances are they'll never again touch the code. There's no incentive for them to get things right, and there's little incentive to understand your system or learn anything. At the risk of making an unfair generalization, craftsmanship doesn't matter to most contractors.

    The education problem

    I don't know what they teach in college CS courses. I've believed for most of my adult life that a college degree was an essential part of being successful. Of course I would hold that bias, since I did it, and have the paper to show for it in a box somewhere in the basement. My first clue that maybe this wasn't a fully qualified opinion comes from the fact that I double-majored in journalism and radio/TV, not computer science. Eventually I worked with people who skipped college entirely, many of them at Microsoft. Then I worked with people who had a masters degree who sucked at writing code, next to the high school diploma types that rock it every day. I still think there's a lot to be said for the social development of someone who has the on-campus experience, but for software developers, college might not matter.

    As I mentioned before, most of us are not writing compilers, and we never will. It's actually surprising to find how many people are self-taught in the art of software development, and that should reveal some interesting truths about how we learn. The first truth is that we learn largely out of necessity. There's something that we want to achieve, so we do what I call just-in-time learning to meet those goals. We acquire knowledge when we need it. So what about the gaps in our knowledge? That's where the most valuable education occurs, via our mentors. They're the people we work next to and the people who write blogs. They are critical to our professional development. They don't need to be an encyclopedia of jargon, but they understand the craft. Even at this stage of my career, I probably can't tell you what SOLID stands for, but you can bet that I practice the principles behind that acronym every day. That comes from experience, augmented by my peers. I'm hell bent on passing that experience to others.

    Process issues

    If you're a manager type and don't do much in the way of writing code these days (shame on you for not messing around at least), then your job is to isolate your tradespeople from nonsense, while bringing your business into the realm of modern software development. That doesn't mean you slap up a white board with sticky notes and start calling yourself agile; it means getting all of your stakeholders to understand that frequent delivery of quality software is the best way to deal with change and evolving expectations. It also means that you have to play technical overlord to make sure the education and quality issues are dealt with. That's why I make the crack about sticky notes, because without the right technique being practiced among your code monkeys, you're just a guy with sticky notes. You're asking your business to accept frequent and iterative delivery; now make sure that the folks writing the code can handle the same thing. This means unit testing, the right instrumentation, integration tests, automated builds and deployments... all of the stuff that makes it easy to see when change breaks stuff.

    The prognosis

    I strongly believe that education is the most important part of what we do. I'm encouraged by things like The Starter League, and it's the kind of thing I'd love to see more of. I would go as far as to say I'd love to start something like this internally at an existing company. Most of all though, I can't emphasize enough how important it is that we mentor each other and share our knowledge. If you have people on your staff who don't want to learn, fire them. Seriously, get rid of them. A few months working with someone really good, who understands the craftsmanship required to build supportable and maintainable code, will change that person forever and increase their value immeasurably.

    Read the article

  • Is there any free voice transcription software?

    - by netvope
    I have some 1-hour-long voice recordings containing useful information that I may need to look up in the future. Instead of transcribing them myself (which would take me many hours), I want to automate this with software. I don't need an accurate transcription; it's OK as long as I can get an idea of what was being talked about by skimming the transcription. With this, I can quickly figure out which part of the audio contains the information I need, which is much more convenient than seeking randomly. In theory I could write a program to split the audio into 3-minute chunks and pipe them into Google Voice's free voicemail transcription service... but I hope there are better solutions. Do you know of any free voice transcription software? Note: free trials are also acceptable.
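
    For the splitting step at least, no custom program is needed; a hedged example using ffmpeg's segment muxer (available in reasonably recent ffmpeg builds, and the file names are placeholders):

        # split recording.wav into 3-minute chunks: chunk000.wav, chunk001.wav, ...
        ffmpeg -i recording.wav -f segment -segment_time 180 -c copy chunk%03d.wav

    That leaves only the upload-and-transcribe side to automate against whatever service ends up being used.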

    Read the article

  • Devoxx 2011 Started Today

    - by Yolande
    Devoxx 2011, organized by a Java user group in Belgium, is the biggest Java conference in Europe. The first two University Days set the tone for the week-long conference with in-depth technical sessions led by luminaries from the Java community and industry experts. Each day is a great mix of 3-hour sessions and hands-on labs, 30-minute Tools-in-Action sessions giving tips for faster and better application development, and the traditional Birds-of-a-Feather sessions in the evening. Java sessions for today and tomorrow:

    - Next Gen Enterprise Apps - Bert Ertman and Paul Bakker talked about the new Java EE 6 APIs that reduce the need for boilerplate code and configuration.
    - JavaFX 2.0 – A Java developer’s guide - Stephen Chin and Peter Pilgrim will give an overview of the new version and how Java developers can take advantage of it.
    - Java Rich Clients with JavaFX 2.0 - Richard Bair and Jasper Potts will get into the JavaFX 2.0 APIs.
    - Building an end-to-end application using Java EE 6 and NetBeans - Arun Gupta will showcase how to write Java EE 6 applications more effectively.
    - The OpenJDK Community BOF with Dalibor Topic.

    Starting Tuesday, come by the Oracle booth to chat about technology, enter our raffle and have a beer every day at 18:45.

    The sessions will be available on the Parleys website after the conference. In the meantime, you can learn a lot about these Java technologies on our website:

    - JavaFX 2.0 tutorials and documentation
    - OpenJDK
    - News from the GlassFish community
    - Java EE 6 resources
    - JavaOne sessions

    Read the article
