Search Results

Search found 20281 results on 812 pages for 'software engineer'.

Page 558/812 | < Previous Page | 554 555 556 557 558 559 560 561 562 563 564 565  | Next Page >

  • FreeNX Server w/ nxagent 3.5 not able to create shadow sessions

    - by Jenna Whitehouse
    I am running a FreeNX server on Ubuntu 11.10 and am unable to do session shadowing. I get the authorization prompt, but the shadow client crashes after. The NX server log in the user's .nx directory is as follows: Error: Aborting session with 'Server is already active for display 3000 If this server is no longer running, remove /tmp/.X3000-lock and start again'. Session: Aborting session at 'Mon Oct 1 14:26:44 2012'. Session: Session aborted at 'Mon Oct 1 14:26:44 2012'. This then deletes the lock file, which is the lock file for the initial Unix session and crashes out. Everything works for a normal session, and shadowing works up to the authorization prompt. I am using this software: Ubuntu 11.10 freenx-server 0.7.3.zgit.120322.977c28d-0~ppa11 nx-common 0.7.3.zgit.120322.977c28d-0~ppa11 nxagent 1:3.5.0-1-2-0ubuntu1ppa8 nxlibs 1:3.5.0-1-2-0ubuntu1ppa8 Any help is appreciated, thanks!

    Read the article

  • Can't access some websites with any browser

    - by Charles Kingsmill
    I'm running Windows 7 64-bit on a new Samsung laptop and accessing the internet okay via ethernet cable to my university's ISP. Some sites work fine (e.g. google.com) but I can't access others at all (microsoft.com, topshop.com). I can't connect to those sites in safe mode with networking, and ping and tracert both fail. There's no proxy. Other users can connect successfully to these sites using my cable and socket. I've tried all the following with no success: using various browsers (IE9, FF, Chrome); creating a new user; updating drivers; clearing the DNS cache; using OpenDNS and Google's DNS; turning off Avast; tweaking the MTU; running the MS malicious software removal tool; running Spybot S&D; reviewing the hosts file; disabling the IPv6 options; repairing/resetting winsock settings; disabling advanced javascript options. I have run out of ideas... can anyone see anything I've missed?

    Read the article

  • What is the ideal length of a method?

    - by iPhoneDeveloper
    In object-oriented programming, there is no exact rule on the maximum length of a method, but I still found these two quotes somewhat contradicting each other, so I would like to hear what you think. In Clean Code: A Handbook of Agile Software Craftsmanship, Robert Martin says: The first rule of functions is that they should be small. The second rule of functions is that they should be smaller than that. Functions should not be 100 lines long. Functions should hardly ever be 20 lines long. and he gives an example from Java code he saw from Kent Beck: Every function in his program was just two, or three, or four lines long. Each was transparently obvious. Each told a story. And each led you to the next in a compelling order. That’s how short your functions should be! This sounds great, but on the other hand, in Code Complete, Steve McConnell says something very different: The routine should be allowed to grow organically up to 100-200 lines; decades of evidence say that routines of such length are no more error prone than shorter routines. And he gives a reference to a study that says routines 65 lines or longer are cheaper to develop. So while there are diverging opinions on the matter, is there a practical best practice for determining the ideal length of a method for you?
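
    For a concrete picture of the "Clean Code" end of this spectrum, here is a minimal sketch in C# (the same idea applies in Java; the class and all names are invented for illustration): one short top-level method composed of tiny, intention-revealing helpers.

    ```csharp
    using System;
    using System.Linq;

    // Hypothetical sketch: a top-level method that reads like a story,
    // built from helpers that are each only a few lines long.
    public static class ReportPrinter
    {
        public static string Render(string[] rawLines)
        {
            var cleaned = RemoveBlankLines(rawLines);
            var numbered = NumberLines(cleaned);
            return JoinWithNewlines(numbered);
        }

        private static string[] RemoveBlankLines(string[] lines) =>
            lines.Where(l => !string.IsNullOrWhiteSpace(l)).ToArray();

        private static string[] NumberLines(string[] lines) =>
            lines.Select((l, i) => $"{i + 1}: {l}").ToArray();

        private static string JoinWithNewlines(string[] lines) =>
            string.Join(Environment.NewLine, lines);
    }
    ```

    Whether three one-line helpers are clearer than one ten-line method is exactly the judgment call the two books disagree on.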

    Read the article

  • Microsoft Researchers show off the best Touch Screen ever made. Better than Apple's touch screens!

    - by Gopinath
    All the touch devices we have in the market today, like iPads, iPhones, Samsung tablets and phones, etc., have a very small issue – 100 milliseconds of lag. The lag is the amount of time a touch device takes to respond after you touch it. The 100 milliseconds of lag may not be an issue when you are tapping and swiping the interface elements on a device, but it is apparent when you move your finger around the screen quickly. For example, if you use any painting app, the lag is very obvious and the screen responds more slowly than an artist can paint with his finger. Researchers at Microsoft labs came out with a prototype touch device that drastically cuts the 100 milliseconds of lag down to just 1 millisecond. That’s 100 times faster than today’s touch screen devices. Check out the video embedded below for a demo of the new touch screen. Over at TechCrunch, Chris Velazco says: The difference is staggering, especially when Dietz trots out the slow-motion footage. With the delay between touch input and screen response slashed by orders of magnitude, a device that sports the sort of super-low-latency Dietz envisions has the potential to feel far more (for lack of a better term) natural than its brethren. There’s zero delay when you slide a checker across a board, for example, and bringing that sort of instantaneous feedback to the many screens in our lives could help to bridge the gap between operating a bit of software and the feeling of interacting with objects. It will be a great boost to Microsoft’s tablet strategy if they succeed in bringing this research to mass market and allowing its partners to use the technology on Windows 8 tablets.

    Read the article

  • Tool to check if XML is valid in my VS2012 comments

    - by davidjr
    I am writing the documentation for our company's software, developed with VS2012. I need to add XML examples to the summary of each class, due to XML instantiation of objects. We are using Sandcastle to create the documentation (company choice), and I want to be able to review my XML comments without building the help file every time. Is there an application that anyone would recommend where I can view how the XML renders before I build the help file? Here is my example: /// <summary> /// Performs DFT on a data array, writes output in a CSV file. /// </summary> /// <example> /// <para>XML declaration</para> /// <code lang="xml" xml:space="preserve"> /// &lt;DataProvider name="DftDP" description="Computes DFT" etc... I want to check the XML to make sure it is valid, maybe by copying and pasting it into a tool of some sort?
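
    One lightweight option, as a sketch rather than a recommendation of any particular tool: paste the fragment (with entities such as &lt; decoded back to <) into a scratch console app or unit test and let the .NET parser report well-formedness errors. The fragment string below is a placeholder standing in for the real doc-comment content.

    ```csharp
    using System;
    using System.Xml.Linq;

    // Rough sketch: let XDocument.Parse report any well-formedness errors,
    // including the line and position of the offending character.
    class XmlCommentCheck
    {
        static void Main()
        {
            const string fragment =
                "<DataProvider name=\"DftDP\" description=\"Computes DFT\" />"; // placeholder
            try
            {
                XDocument.Parse(fragment);
                Console.WriteLine("Fragment is well-formed XML.");
            }
            catch (System.Xml.XmlException ex)
            {
                Console.WriteLine(
                    $"Invalid XML at line {ex.LineNumber}, position {ex.LinePosition}: {ex.Message}");
            }
        }
    }
    ```

    This only checks well-formedness, not whether the markup matches the schema your loader expects, but it catches the typical copy-and-paste breakages before a full Sandcastle build.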

    Read the article

  • Friday Fun: Building Blasters 2

    - by Mysticgeek
    After dealing with unnecessary spreadsheets and TPS reports all week, it’s time to waste time playing a flash game. Today we take a look at Building Blasters 2, where you strategically place explosives to bring down structures. Building Blasters 2 You need to place explosives carefully to clear areas in the red level, keep bystanders safe, and manage your budget. After placing the explosives on the structure, you can set the amount of time that passes before they blow. This comes in handy when you reach advanced levels. When you’re ready to start the demolition, click on the Detonate button and watch the buildings fall. If you don’t achieve the objectives, you will get the Demolition Error screen and can replay the level. After you’ve received enough money, you’ll get a message between missions telling you there is enough money to buy items in the shop. You can get enhanced destructive devices such as nitroglycerin, a wrecking ball, call in an air strike, and more… If you’re sick of the pointy haired boss dragging you down all week, pretend the structures are the office building and destroy away. Building Blasters 2 is a great way to have fun and let off steam so you can enjoy your weekend. Play Building Blasters 2 For additional fun games to play, make sure to check out the How-To Geek Arcade.

    Read the article

  • Linux to Solaris @ Morgan Stanley

    - by mgerdts
    I came across this blog entry and the accompanying presentation by Robert Milkoski about his experience switching from Linux to Oracle Solaris 11 for a distributed OpenAFS file serving environment at Morgan Stanley. If you are an IT manager, the presentation will show you: Running Solaris with a support contract can cost less than running Linux (even without a support contract) because of technical advantages of Solaris. IT departments can benefit from hiring computer scientists into Systems Programmer or similar roles.  Their computer science background should be nurtured so that they can continue to deliver value (savings and opportunity) to the business as technology advances. If you are a sysadmin, developer, or somewhere in between, the presentation will show you: A presentation that explains your technical analysis can be very influential. Learning and using the non-default options of an OS can make all the difference as to whether one OS is better suited than another.  For example, see the graphs on slides 3 - 5.  The ZFS default is to not use compression. When trying to convince those that hold the purse strings that your technical direction should be taken, the financial impact can be the part that closes the deal.  See slides 6, 9, and 10.  Sometimes reducing rack space requirements can be the biggest impact because it may stave off or completely eliminate the need for facilities growth. DTrace can be used to shine light on performance problems that may be suspected but not diagnosed.  It is quite likely that these problems have existed in OpenAFS for a decade or more.  DTrace made diagnosis possible. DTrace can be used to create performance analysis tools without modifying the source of software that is under analysis.  See slides 29 - 32. Microstate accounting, visible in the prstat output on slide 37 can be used to quickly draw focus to problem areas that affect CPU saturation.  Note that prstat without -m gives a time-decayed moving average that is not nearly as useful. Instruction level probes (slides 33 - 34) are a super-easy way to identify which part of a function is hot.

    Read the article

  • Visual Studio 2010 Is Here!

    - by Bill Evjen
    I think back to the days of the first versions of Visual Studio (when it was called Visual Studio .NET, remember?) and I think about how far Microsoft has come with this IDE. It really is the best IDE on the market. There is so much to this IDE it is amazing. It now can really handle managing your complete software application development lifecycle. For me, it is (besides Windows 7) the best and most successful product Microsoft has developed. You can obviously get this now and it is available on MSDN and some other places: MSDN Visual Studio Trial Editions Visual Studio 2010 Express Editions (free) You will also find great info at the Visual Studio Developer Center. Some other interesting tidbits of info: JetBrain’s ReSharper 5.0 has been released for VS2010 Oracle will have the new Oracle Dev Tools for VS2010 within one month - http://bit.ly/9gC9NE Visual Studio 64-bit - Why there is no 64-bit version of VS - http://bit.ly/dhhwAj In installing this version of Visual Studio, if you have been working on the previous RC builds, then you are going to want to uninstall these previous editions of the 2010 product. You can do this through the Add Remove Programs dialog and you are going to want to select the appropriate item from the long list of Visual Studio items. You are then going to want to step through the Visual Studio dialog (it will seem as if you are installing it again) – and you will then come to a point where you can select the option to Uninstall the entire application. If you have installed the Silverlight 4 RC stuff, then you are also going to want to uninstall this and you are also going to want to uninstall the “Update for Visual Studio 2010 (KB976272)” before installing Silverlight RC2 – which you can find on www.silverlight.net. Technorati Tags: vs2010,.net,visualstudio,microsoft

    Read the article

  • TP-LINK-8901 modem does not connect automatically; only works in bridged mode [closed]

    - by Arash
    Possible Duplicate: Cannot Connect To Internet In Automatic (pppoe) Mode Of Modem My TP-LINK-8901 modem does not connect to internet with "always on" (automatic) connection. I have to put it in bridge mode and make a connection in Windows to connect to internet. Also, there is no configuration problem. What I did: Reset modem Update firmware Is there any way to solve the problem myself, e.g. through hardware or software? I can't have it repaired because the warranty already expired.

    Read the article

  • How Visual Studio 2010 and Team Foundation Server enable Compliance

    - by Martin Hinshelwood
One of the things that makes Team Foundation Server (TFS) the most powerful Application Lifecycle Management (ALM) platform is the traceability it provides to those that use it. This traceability is crucial in enabling many companies to adhere to the Compliance regulations to which they are bound (e.g. CFR 21 Part 11 or Sarbanes–Oxley). From something as simple as relating Tasks to Check-ins, or being able to see the top 10 files in your codebase that are causing the most Bugs, to identifying which Bugs and Requirements are in which Release: all that information and more is available in TFS. Although all of this traceability is available within TFS, you do need to understand that it is not for free. Well… I say that, but if you are using TFS properly you will have this information with no additional work except for firing up the reporting. Using Visual Studio ALM and Team Foundation Server you can relate every line of code changed all the way up to Requirements and back down through Test Cases to the Test Results. Figure: The only thing missing is Build In order to build the relationship model below we need to examine how each of the relationships gets there. Each member of your team, from programmer to tester and Business Analyst to Business, has their role to play in knitting this together. Figure: The relationships required to make this work can get a little confusing If Build is added to this to relate Work Items to Builds, and with knowledge of which builds are in which environments, you can easily identify what is contained within a Release. Figure: How are things progressing Along with the ability to produce progress and trend reports, the traceability that is built into TFS can be used to fulfil most audit requirements out of the box, and augmented to fulfil the rest. In order to understand the relationships, let’s look at each of the important Artifacts and how they are associated with each other… Requirements – The root of all knowledge Requirements are the things that the business cares about delivering. These could be derived as User Stories or Business Requirements Documents (BRDs), but they should be what the Business asks for. Requirements can be related to many of the Artifacts in TFS, so let’s look at the model: Figure: If the centre of the world was a requirement We can track which releases Requirements were scheduled in, but this can change over time as more details come to light. Figure: Who edited the Requirement and when There is also the ability to query Work Items based on the history of changes that were made to them. This is particularly important with Requirements. It might not be enough to say which Requirements were completed in a given release, but also to know which Requirements were ever assigned to a particular release. Figure: Some magic required, but result still achieved As an augmentation to this it is also possible to run a query that shows results from the past, just as if we had a time machine. You can take any Query in the system and add an “Asof” clause at the end to query historical data in the operational store for TFS. select <fields> from WorkItems [where <condition>] [order by <fields>] [asof <date>] Figure: Work Item Query Language (WIQL) format In order to achieve this you do need to save the query as a *.wiql file to your local computer and edit it in notepad, but once imported into TFS you can run it any time you want. Figure: Saving Queries locally can be useful
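
As a rough illustration of the “Asof” idea, here is a minimal C# sketch that runs the same kind of historical query through the classic TFS client object model instead of a saved *.wiql file; the server URL, work item type and date are placeholders, not values from the article.

```csharp
using System;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.WorkItemTracking.Client;

// Sketch only: ask TFS which Requirements existed as of a date in the past.
class AsOfQuery
{
    static void Main()
    {
        var collection = new TfsTeamProjectCollection(
            new Uri("http://tfsserver:8080/tfs/DefaultCollection")); // placeholder URL
        var store = collection.GetService<WorkItemStore>();

        const string wiql =
            "SELECT [System.Id], [System.Title] " +
            "FROM WorkItems " +
            "WHERE [System.WorkItemType] = 'Requirement' " +
            "ORDER BY [System.Id] " +
            "ASOF '2010-01-01'"; // placeholder date

        foreach (WorkItem item in store.Query(wiql))
            Console.WriteLine("{0}: {1}", item.Id, item.Title);
    }
}
```
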
All of these Audit features are available throughout the Work Item Tracking (WIT) system within TFS. Tasks – Where the real work gets done Tasks are the work horse of the development team, but they are only as useful as Excel if you do not relate them properly to other Artifacts. Figure: The Task Work Item Type has its own relationships Requirements should be broken down into Tasks that the development team works from to build what is required by the business. This may be done by a small dedicated group, or by everyone that will be working on the software team, but however it happens, each of the Tasks created should be a Child of a Requirement Work Item Type. Figure: Tasks are related to the Requirement Tasks should be used to track the day-to-day activities of the team working to complete the software, and as such they should be kept simple and short, lest developers think they are more trouble than they are worth. Figure: Task Work Item Type has a narrower purpose Although the Task Work Item Type describes the work that will be done, the actual development work involves making changes to files that are under Source Control. These changes are bundled together in a single atomic unit called a Changeset, which is committed to TFS in a single operation. During this operation developers can associate Work Items with the Changeset. Figure: Tasks are associated with Changesets Changesets – Who wrote this crap Changesets themselves are just an inventory of the changes that were made to a number of files to complete a Task. Figure: Changesets are linked by Tasks and Builds Figure: Changesets tell us what happened to the files in Version Control Although comments can be changed after the fact, the inventory and Work Item associations are permanent, which allows us to Audit all the way down to the individual change level. Figure: On Check-in you can resolve a Task which automatically associates it Because of this we can view the history on any file within the system and see how many changes have been made and what Changesets they belong to. Figure: Changes are tracked at the File level What would be even more powerful would be if we could view these changes superimposed over the top of the lines of code. Some people call this a blame tool because it is commonly used to find out which of the developers introduced a bug, but it can also be used as another method of Auditing changes to the system. Figure: Annotate shows the lines The Annotate functionality allows us to visualise the relationship between the individual lines of code and the Changesets. In addition to this you can create a Label and apply it to a version in your version control. The problem with Labels is that they can be changed after they have been created, with no traceability. This makes them practically useless for any sort of compliance audit. So what do you use? Branches – And why we need them Branches are a really powerful tool for development and release management, but they are most important for audits. Figure: One way to Audit releases The R1.0 branch can be created from the Label that the Build creates on the R1 line when a Release build is created. It can be created as soon as the Build has been signed off for release. However it is still possible that someone changed the Label between this time and its creation. A better method is to explicitly link the Build output to the Build.
Builds – Let’s tie some more of this together Builds are the glue that helps us enable the next level of traceability by tying everything together. Figure: The dashed pieces are not out of the box but can be enabled When the Build is called and starts, it looks at what it has been asked to build and determines what code it is going to get and build. Figure: The folder identifies what changes are included in the build The Build sets a Label on the Source with the same name as the Build, but the Build itself also includes the latest Changeset ID that it will be building. At the end of the Build the Build Agent identifies the new Changesets it is building by looking at the Check-ins that have occurred since the last Build. Figure: What changes have been made since the last successful Build It will then use that information to identify the Work Items that are associated with all of those Changesets. The Changesets are associated with the Build, and the “Integrated In” field of those Work Items is changed. Figure: Find all of the Work Items to associate with The “Integrated In” field of all of the Work Items identified by the Build Agent as being integrated into the completed Build is updated to reflect the Build number that successfully integrated that change. Figure: Now we know which Work Items were completed in a build Now we can link a single changed line of code all the way back through the Task that initiated the action to the Requirement that started the whole thing, and back down to the Build that contains the finished Requirement. But how do we know whether that Requirement has been fully tested, or even meets the original Requirements? Test Cases – How we know we are done The only way we can know whether a Requirement has been completed to the required specification is to Test that Requirement. In TFS there is a Work Item type called a Test Case. Test Cases enable two scenarios. The first scenario is the ability to track and validate Acceptance Criteria in the form of a Test Case. If you agree with the Business on a set of goals that must be met for a Requirement to be accepted by them, it becomes difficult for them to reject a Requirement when it passes all of the tests, and it also provides a level of traceability and validation for audit that a feature has been built and tested to order. Figure: You can have many Acceptance Criteria for a single Requirement It is crucial for this to work that someone from the Business signs off on the Test Case moving from the “Design” to the “Ready” state. The second is the ability to associate an MS Test test with the Test Case, thereby tracking the automated test. This is useful when you want to track a test and the test results of a Unit Test designed to prove the existence, and then the non-reoccurrence, of a Bug. Figure: Associating a Test Case with an automated Test Although it is possible, it may not make sense to track the execution of every Unit Test in your system; there are, however, many Integration and Regression tests that may be automated and that it would make sense to track in this way. Bug – Let’s not have regressions In order to know whether a Bug in the application has been fixed, and to make sure that it does not reoccur, it needs to be tracked. Figure: Bugs are the centre of their own world If the fix to a Bug is big enough to require that it is broken down into Tasks, then it is probably a Requirement. You can associate a check-in with a Bug and have it tracked against a Build.
You would also have one or more Test Cases to prove the fix for the Bug. Figure: Bugs have many associations This allows you to track Bugs/Defects in your system effectively and report on them. Change Request – I am not a feature In the CMMI Process template Change Requests can also be easily tracked through the system. In some cases it can be very important to track Change Requests separately, as an Auditor may want to know what was changed and who authorised it. Again, similar to Bugs, if the Change Request is big enough that it would need to be broken down into Tasks, it is in reality a new feature and should be tracked as a Requirement. Figure: Make sure your Change Requests only Affect Requirements and not rewrite them Conclusion Visual Studio 2010 and Team Foundation Server together provide an exceptional Application Lifecycle Management platform that can help your team comply with even the harshest of Compliance requirements while still enabling them to be Agile. Most Audits are heavy on required documentation, but most of that information is captured for you as long as you do it right. You don’t even need every team member to understand it all, as each of the Artifacts is relevant to a different type of team member: Business Analysts manage Requirements and Change Requests; Programmers manage Tasks and check in against Change Requests and Bugs; Testers manage Bugs and Test Cases; Build Masters manage Builds. Although there is some crossover, there are still roles or “hats” that are worn. Do you think this is all achievable? Have I missed anything that you think should be there?

    Read the article

  • How do I structure code and builds for continuous delivery of multiple applications in a small team?

    - by kingdango
    Background: 3-5 developers supporting (and building new) internal applications for a non-software company. We use TFS although I don't think that matters much for my question. I want to be able to develop a deployment pipeline and adopt continuous integration / deployment techniques. Here's what our source tree looks like right now. We use a single TFS Team Project. $/MAIN/src/ $/MAIN/src/ApplicationA/VSSOlution.sln $/MAIN/src/ApplicationA/ApplicationAProject1.csproj $/MAIN/src/ApplicationA/ApplicationAProject2.csproj $/MAIN/src/ApplicationB/... $/MAIN/src/ApplicationC $/MAIN/src/SharedInfrastructureA $/MAIN/src/SharedInfrastructureB My Goal (a pretty typical promotion pipeline) When a code change is made to a given application I want to be able to build that application and auto-deploy that change to a DEV server. I may also need to build dependencies on Shared Infrastructure Components. I often also have some database scripts or changes as well If developer testing passes I want to have an manually triggered but automated deploy of that build on a STAGING server where end-users will review new functionality. Once it's approved by end users I want to a manually triggered auto-deploy to production Question: How can I best adopt continuous deployment techniques in a multi-application environment? A lot of the advice I see is more single-application-specific, how is that best applied to multiple applications? For step 1, do I simply setup a separate Team Build for each application? What's the best approach to accomplishing steps 2 and 3 of promoting latest build to new environments? I've seen this work well with web apps but what about database changes

    Read the article

  • How to persist changes to instances in the cloud.

    - by Peter NUnn
    Hi folks, I must be missing something here, but can someone clue me in on how to persist changes (such as software installs etc.) on machines in the cloud (either EC2 or my own Eucalyptus cloud)? I have instances running and can attach extra disks to them etc., but every time I terminate the instance, all of my changes are lost the next time I run them. Now, this sort of makes sense in that the instances are virtual, but there must be some way to make these changes persist. I'm just missing how it's done. Thanks. Peter.

    Read the article

  • Ubuntu - connecting 3rd monitor fails with "xrandr: cannot find crtc for output DVI-0"

    - by MDCore
    I've got a laptop with a DVI and VGA output on the back. With everything connected it will only allow me to run 2 of the 3 monitors e.g. laptop display + VGA or DVI+VGA but not all 3. xrandr says I have 2 CRTC's, 0 and 1. The internet says I should be able to share a CRTC if the modeline is the same, and my 2 external monitors are the exact same make and model. How do I convince the software to drive all this hardware?

    Read the article

  • Remote login/access on windows

    - by acidzombie24
    Hi, I was wondering what software I can use to access my machine and other machines remotely? I used ssh, which is nice, but I don't know what it would be like on Windows. (I assume it's the same idea but with a Windows console instead of a bash terminal?) Windows has a lot of applications that require GUI/mouse clicks. Actually, I don't know a single ssh or vpn command line installer, not that I'm complaining (but it is helpful if you can mention some). I haven't used a VPN; is this taking control of a user's screen/session? Or is it another instance/session, as if you logged in as a different user on that box? What solutions are at my disposal for Windows? (7)

    Read the article

  • Can't enable wireless lan on Fujitsu Siemens A1665G with Ubuntu 11.10 installed

    - by Theo
    I saw my old notebook yesterday and wanted to make it work again. On Windows XP the wireless still worked fine. Then I installed a new Ubuntu 11.10 32-bit and sadly I'm not able to get the wireless enabled. [I replaced Win XP entirely] lspci lists the following network controller: 08:0a.0 Network controller: Broadcom Corporation BCM4318 [AirForce One 54g] 802.11g Wireless LAN Controller (rev 02) So after a recommendation from this link I installed the b43 firmware module. iwconfig prints the following: wlan0 IEEE 802.11bg ESSID:off/any Mode:Managed Access Point: Not-Associated Tx-Power=off Retry long limit:7 RTS thr:off Fragment thr:off Power Management: off As you can see, my wireless LAN adapter is not turned on. sudo iwconfig wlan0 txpower auto doesn't change anything. Then I tried to make it work with rfkill. rfkill list 0: phy0: Wireless LAN Soft blocked: no Hard blocked: yes sudo rfkill unblock all rfkill list 0: phy0: Wireless LAN Soft blocked: no Hard blocked: yes remains the same. The question now is how I can enable the hard-blocked wireless LAN. There is no integrated hardware switch for wlan. However there is a button to change the state. I always thought this would be software-side, but it seems to make some hardware changes as well... The wireless LED is also not blinking (as it did on Windows XP). I reset the BIOS and searched for some settings in there, but it has only a few options and nothing to do with wireless settings; nothing works here... Finally I tried to install the acer hotkeys, but I was not able to manage that. I installed the acerhkgui package, but during initialization it was not able to compile the acer hotkeys for my machine. There was a message that asm/linkage.h was not found while compiling. Do you have any ideas what I could do to disable this hard block and make my wireless card work? PS: I also tried sudo rm /dev/rfkill and a reboot to reinitialise that stuff... No success :(

    Read the article

  • The spork/platypus average: shameless self promotion

    - by Roger Hart
    This is the video of the presentation I gave at UA Europe and TCUK this year. The actual sub-title was "Content strategy at Red Gate Software", but this heading feels more honest. For anybody who missed it, or is just vaguely interested, here's a link to me talking about de-suckifying the web. You can find the slideshare deck here, too* Watching it back is more than a little embarrassing, and makes me really, really want to do a follow up, so I can do three things: explain the rest of the big web project, now we've done it; give some data on the outcome of the content review; and make a grovelling apology to our marketing guys, who I've been unfairly mean to in a childish effort to look cool. There are a whole bunch of other TCUK presentations online, too. You can find them all here: http://tiny.cc/tcuk10_videos I'd particularly recommend Chris Atherton's: "Everything you always wanted to know about psychology and technical communication" - it's full of cool stuff. You should probably also watch David Black's opening keynote, which managed to make my hour of precocious grandstanding look measured, meek, and helpful. He actually makes some interesting points, but you'd basically have to ship Richard Dawkins off to Utah, if you wanted to go further out of your way to aggravate your audience. It does give an engaging account of running a large tech comms project, and raise some questions about how we propose to understand a world where increasing amounts of our stuff gets done by increasingly many increasingly complicated tissues of APIs. Well, sort of. That's what all the notes I made were about, anyway. *Slideshare ate my fonts. Just so we're clear on this: I'd never use badly-kerned Arial in a presentation. Don't worry.

    Read the article

  • Dual Core or Quad Core CPU for NetBeans/Eclipse development?

    - by cdb
    I am going to buy a new desktop CPU. I am a programmer who mainly uses the NetBeans IDE for Java web application development, with the GlassFish application server. I went through the discussion regarding Dual Core vs Quad Core. My doubt is that software like IDEs (NetBeans, Eclipse, etc., with a server running) may not be written with multiple cores in mind. I am not a game addict... So what is best for me, and which company should I choose, AMD or Intel?

    Read the article

  • General questions regarding open-source licensing

    - by ndg
    I'm looking to release an open-source iOS software project, but I'm very new to the licensing side of things. While I'm aware that the majority of answers here will not be lawyers, I'd appreciate it if anyone could steer me in the right direction. With the exception of the following requirements I'm happy for developers to largely do whatever they want with the project's source code. I'm not interested in any copyleft licensing schemes, and while I'd like to encourage attribution in derivative works it is not required. As such, my requirements are as follows: Original source can be distributed and re-distributed (verbatim) both commercially and non-commercially as long as the original copyright information, website link and license is maintained. I wish to retain rights to any of the multi-media distributed as part of the project (sound effects, graphics, logo marks, etc). Such assets will be included to allow other developers to easily execute the project, but cannot be re-distributed in any manner. I wish to retain rights to the application's name and branding. Further to selecting an applicable license, I have the following questions: The project makes use of a number of third-party libraries (all licensed under variants of the MIT license). I've included individual licenses within the source (and application) and believe I've met all requirements expressed in these licenses, but is there anything else that needs to be done before distributing them as part of my open-source project? Also included in my project is a single proprietary, closed-source library that's used to power a small part of the application. I'm obviously unable to include this in the source release, but what's the best way of handling this? Should I simply weak-link the project and exclude it entirely from the Git project?

    Read the article

  • Windows Server 2012 R2 application installation slowness

    - by Eric
    Good afternoon. Windows Server 2012 R2 as host, fully up to date; Windows Server 2012 R2 as guest, fully up to date. Whether I am running it as a host or guest, whenever I am installing an application that should take a couple of minutes, it always ends up taking 3x-5x longer than it should. If I take the same applications and install them on any other OS (i.e. 2k8, 7), they install very quickly. Dell server management software took 30 minutes or so, when it takes 5 minutes or so on 2k8. Even with a clean install not joined to a domain I still experience this same slowness with applications I attempt to install. So my question to everyone is: are you or anyone else experiencing this? I cannot find anything on Google, so this is why I am here.

    Read the article

  • How do you stay productive when dealing with extremely badly written code?

    - by gaearon
    I don't have much experience in working in software industry, being self-taught and having participated in open source before deciding to take a job. Now that I work for money, I also have to deal with some unpleasant stuff, which is normal of course. Recently I was assigned to add logging to a large SharePoint project which is written by some programmer who obviously was learning to code on the job. After 2 years of collaboration, the client switched to our company, but the damage was done, and now somehow I need to maintain this code. Not that the code was too hard to read. Despite problems - each project has one class with several copy-pasted methods, enormous if nestings, Systems Hungarian, undisposed connections — it's still readable. However, I found myself absolutely unproductive despite working on something as simple as adding logging. Basically, I just need to go through the code step by step and add some trace calls. However, the idiocy of the code is so annoying that I get tired within 10 minutes of starting. In the beginning, I used to add using constructs, reduce nesting by reversing if's, rename the variables to readable names—but the project is large, and eventually I gave up. I know this is not the task I should be doing, but at least reducing the mess gave me some kind of psychological reward so I could keep going. Now the trick stopped working, and I still have 60% of my work to do. I started having headaches after work, and I no longer get the feeling of satisfaction I used to get - which would usually allow me to code for 10 hours straight and still feel fresh. This is not just one big rant, for I really do have an actual question: Is there a way to stay productive and not to fight the windmills? Is there some kind of psychological trick to stay focused on the task, instead of thinking “How stupid is that?” each time I see another clever trick by the previous programmer? The problem with adding logging is that I actually have to understand what the code does, and doing so hurts my brain in an unpleasant fashion.
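
    For anyone unfamiliar with the two small clean-ups mentioned above (wrapping undisposed connections in using blocks and reversing ifs into guard clauses), here is a hypothetical before/after sketch in C#; all names are invented and not from the project described.

    ```csharp
    using System.Data.SqlClient;

    // Hypothetical before/after: guard clause instead of nested ifs,
    // and a "using" block so the connection is always disposed.
    class Repository
    {
        private readonly string connectionString =
            "Server=.;Database=Example;Integrated Security=true"; // placeholder

        // Before: nested ifs, connection never disposed.
        public void SaveBefore(string id)
        {
            var conn = new SqlConnection(connectionString);
            conn.Open();
            if (id != null)
            {
                if (id.Length > 0)
                {
                    // ... the real work, two levels deep ...
                }
            }
        }

        // After: the guard clause exits early; "using" guarantees disposal.
        public void SaveAfter(string id)
        {
            if (string.IsNullOrEmpty(id)) return;

            using (var conn = new SqlConnection(connectionString))
            {
                conn.Open();
                // ... the real work, at the top level of the method ...
            }
        }
    }
    ```
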

    Read the article

  • Where can I learn various hacking techniques on the web?

    - by Carson Myers
    I would like to try my hand at hacking -- that is, exploiting various website vulnerabilities. Not for any illegal purpose mind you, but so I can have a better understanding and appreciation of these exploits while writing my own web software. I seem to recall that there was a community that hosted a bunch of demo websites, and you had to find and exploit certain vulnerabilities with each one. I can't remember what it is called but this is the sort of thing I am looking for -- I have read a tonne of little XSS and CSRF examples but have yet to find a real-life hands-on example of one. Does anyone know of such a place, where I can be given an example page and look for security holes? I would really rather not try this with actual websites, I don't want to break any laws.

    Read the article

  • public key infrastructure - distribute bad root certificates

    - by iamrohitbanga
    Suppose a hacker launches a new Linux distro with Firefox provided with it. Now, a browser contains the certificates of the root certification authorities of the PKI. Because Firefox is a free browser, anyone can package it with fake root certificates. Can this be used to fraudulently authenticate some websites? How? Many existing Linux distros are mirrored by people. They can easily package software containing certificates that can lead to such attacks. Is the above possible? Has such an attack taken place before?
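
    For a concrete sense of why the bundled root list matters: Firefox ships its own root certificate store (NSS), so a repackaged build can silently include extra roots, and any certificate that chains up to one of them will look valid to that browser. The C# sketch below is only an illustration of the general chain-building check a client performs against its local trust store (the certificate file path is a placeholder); it is not Firefox's actual code path.

    ```csharp
    using System;
    using System.Security.Cryptography.X509Certificates;

    // Sketch: chain building succeeds only if the chain ends at a root the
    // local trust store already contains - exactly what a tampered store subverts.
    class ChainCheck
    {
        static void Main()
        {
            var serverCert = new X509Certificate2("server.cer"); // placeholder path

            var chain = new X509Chain();
            chain.ChainPolicy.RevocationMode = X509RevocationMode.NoCheck;
            bool trusted = chain.Build(serverCert);

            Console.WriteLine(trusted
                ? "Chain ends at a locally trusted root."
                : "Chain could not be validated against the local trust store.");

            foreach (var status in chain.ChainStatus)
                Console.WriteLine(status.StatusInformation);
        }
    }
    ```
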

    Read the article

  • Announcing IIS Community Newsletter

    - by steve schofield
    I'm excited to announce that a newsletter for the IIS community is available.  Here is the link to sign up.  The goal is to cover all happenings in the IIS community, with a monthly issue.  Sign up for the IIS Community Newsletter: http://www.iislogs.com/newsletter/ A little history: I was previously involved with Brett Hill in authoring the IIS Answers newsletter for about 1 year.  The newsletter literally reached thousands of IIS administrators, providing up-to-date Microsoft and community related information.  It was an excellent source of information.  That is my goal with the IIS Community Newsletter. With everything, there is always some "geeking" that goes into it.  I'm using the www.StarDeveloper.com newsletter application.   For a modest $10.00 fee, the application provides a simple, yet powerful way to manage newsletters.  I added a CAPTCHA feature to sign-up.  The CAPTCHA module I used was provided free by MONDOR software.  I personally never used one on an application before and it was easy to implement.  Thanks for sharing!  Cheers, Steve Schofield, Microsoft MVP - IIS

    Read the article

  • Customized windows xp installation

    - by user23950
    How do I create a customized version of an XP installation CD? I want a hands-off installation that will install all the themes, drivers, and applications I need without my intervention. How can I do that? I tried to make use of a piece of software called nLite, but when I tested the ISO file using VirtualBox, VirtualBox said that there's a certain file that cannot be copied. And by the way, before I customized the original ISO image of Windows XP SP3, it didn't have the file copying problem that I mentioned above.

    Read the article

  • Just Another Web Service (JAWS) vs SOA

    Over the last few years SOA has been a hot topic, leading it to be abused by many that have no understanding of the concept. In my opinion, one of the largest issues facing SOA is the lack of understanding and experience implementing SOA by business and IT alike. I just recently deployed a new web service that is called by multiple service clients. Would you call this SOA because it is a web service that can be called by any requesting client? In my opinion, this is not SOA; instead it is Just Another Web Service (JAWS).  Just because a company creates a web service does not mean that they are using SOA; in fact it only means that they are using a web service. SOA is an architectural style that focuses on the design of systems based on consumers and providers through the use of contracts.  With this approach SOA needs to be applied from the top down in order for it to reach its full potential. In the case of the web service, the service is just a small part of the entire system that is reusable and has the flexibility to change. In order for a company in this case to move towards SOA, they need to define business processes that can be shared through the use of reusable software and loose coupling. Once the company’s thought and development processes change to address changes in this manner, they can start to become more service-oriented.
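
    As a small, hypothetical illustration of designing around consumer/provider contracts rather than around a single endpoint, here is a WCF-style contract sketch in C# (all names invented): the interface is the published agreement, and any number of providers or consumers can bind to it without sharing implementation.

    ```csharp
    using System.Runtime.Serialization;
    using System.ServiceModel;

    // Illustrative contract-first sketch: the ServiceContract is the agreement
    // consumers bind to; the implementation behind it can change freely.
    [ServiceContract]
    public interface IOrderService
    {
        [OperationContract]
        OrderStatus GetOrderStatus(string orderId);
    }

    [DataContract]
    public class OrderStatus
    {
        [DataMember] public string OrderId { get; set; }
        [DataMember] public string State { get; set; }
    }

    // One possible provider; consumers only ever see IOrderService.
    public class OrderService : IOrderService
    {
        public OrderStatus GetOrderStatus(string orderId) =>
            new OrderStatus { OrderId = orderId, State = "Shipped" };
    }
    ```
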

    Read the article
