Search Results

Search found 21053 results on 843 pages for 'process'.


  • Quantifying the cost of a software project

    - by The Elite Gentleman
    Disclaimer: I didn't know exactly where to put this question. If you feel it is not suitable for Programmers @ StackExchange, feel free to migrate it. Background: Broadening my last question, there is an open request for tender for a software system, and I have decided to take it on. I am a software developer & engineer by profession and, in this tender process, I have to put a price on my bid. I have been provided with documentation consisting of functional and non-functional requirements only. I have to put a project manager's cap on and think of all aspects, e.g. the cost of implementing the project, the resources needed, etc. My question is: Is there a project framework I can follow that breaks the project cycle into steps with corresponding cost aspects, or how would I best go about calculating/approximating the cost of the project?
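
    One starting point, alongside a full framework such as PMBOK or PRINCE2, is an algorithmic estimation model like Basic COCOMO, which turns an estimated code size into effort and schedule figures you can attach a rate to. A minimal sketch in Python; the 2.4/1.05/2.5/0.38 coefficients are the published organic-mode values, while the size and monthly rate are placeholder assumptions:

        # Basic COCOMO, organic mode: rough effort/schedule from estimated size.
        # The coefficients are the published organic-mode values; the KLOC
        # estimate and the per-person-month rate are placeholder assumptions.
        def basic_cocomo_organic(kloc: float) -> tuple[float, float]:
            effort_pm = 2.4 * kloc ** 1.05        # effort in person-months
            schedule_m = 2.5 * effort_pm ** 0.38  # development time in months
            return effort_pm, schedule_m

        effort, months = basic_cocomo_organic(kloc=20)   # assume ~20 KLOC
        cost = effort * 8000                             # assumed rate per person-month
        print(f"~{effort:.1f} person-months over ~{months:.1f} months, cost ~{cost:,.0f}")

    Such models only give a sanity-check number; most bidders still break the requirements into work packages, estimate each, and add contingency on top.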

    Read the article

  • Nameservers and migrating a VPS

    - by MeltingDog
    I am primarily a front-end developer who has been tasked with upgrading my company's VPS. As far as I understand, this is just the process of obtaining a new VPS with WHM/cPanel, migrating the existing accounts over to the new VPS, testing the sites out, and then pointing the DNS at the new nameserver records. That sounds pretty straightforward. What I am having trouble understanding is how to set up the new nameservers on the new VPS. How do I obtain/establish the nameserver records for the new, blank VPS?
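
    In broad strokes (typically): you register the nameserver hostnames as private nameservers (glue records) at your domain registrar, pointing at the new VPS's IPs, enter the same names in WHM's basic setup, and create matching A records in the DNS zone. Once done, a quick sanity check is possible with the dnspython package; the hostnames and addresses below are placeholders, not values from the question:

        # Verify that the new nameserver hostnames resolve to the new VPS.
        # Placeholder names/IPs; requires the dnspython package.
        import dns.resolver

        EXPECTED = {"ns1.example.com": "203.0.113.10",
                    "ns2.example.com": "203.0.113.11"}

        for host, expected_ip in EXPECTED.items():
            answers = dns.resolver.resolve(host, "A")   # query the A record
            ips = {rr.address for rr in answers}
            status = "OK" if expected_ip in ips else "MISMATCH"
            print(f"{host}: {sorted(ips)} [{status}]")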

    Read the article

  • When will my old page stop appearing on Google?

    - by Bane
    I recently bought a new address for my Blogger blog, moving from yannbane.blogspot.com to www.yannbane.com. However, www.yannbane.com addresses do not appear when they are searched for! Is this natural? How long will it take Google to update its index? yannbane.blogspot.com 301s to www.yannbane.com. Both are added to my Webmaster Tools account, but it strangely shows no data for www.yannbane.com. And, finally, is there something I could do to speed up the process?
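
    In the meantime, it is worth double-checking that the old address really answers with a 301; a quick sketch using Python's requests library, where the expected values are assumptions based on the setup described:

        # Check that the old Blogger address answers with a 301 to the new domain.
        import requests

        r = requests.head("http://yannbane.blogspot.com/", allow_redirects=False)
        print(r.status_code)               # expect 301
        print(r.headers.get("Location"))   # expect a www.yannbane.com URL

    Submitting a sitemap for the new domain in Webmaster Tools is the usual way to nudge a recrawl along.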

    Read the article

  • PowerShell Run-As Script

    - by marc dekeyser
    Disclaimer: This script is not of my own making. I found it on a share somewhere and it is so handy I started using it in a bunch of scripts. To the writer: if you're out there, somewhere, and you see this, thank you! It checks whether the script is running as Administrator and, if not, relaunches itself with RunAs:

        # Use the -Check switch to just test for admin rights
        param([Switch]$Check)

        $IsAdmin = ([Security.Principal.WindowsPrincipal] [Security.Principal.WindowsIdentity]::GetCurrent()).IsInRole([Security.Principal.WindowsBuiltInRole] "Administrator")

        if ($Check) { return $IsAdmin }

        if ($MyInvocation.ScriptName -ne "")
        {
            if (-not $IsAdmin)
            {
                try
                {
                    $arg = "-file `"$($MyInvocation.ScriptName)`""
                    Start-Process "$psHome\powershell.exe" -Verb Runas -ArgumentList $arg -ErrorAction 'stop'
                }
                catch
                {
                    Write-Warning "Error - Failed to restart script with runas"
                    break
                }
                exit # Quit this session of PowerShell
            }
        }
        else
        {
            Write-Warning "Error - Script must be saved as a .ps1 file first"
            break
        }

        Write-Host "Script Running As Administrator" -ForegroundColor Red
        Write-Host ""

    Read the article

  • What can I use to make GTK themes?

    - by tutuca
    I'm trying to make a new GTK theme using the Murrine engine, with Humanity (the default in Ubuntu 9.10) as a template. You can grab the code at http://github.com/tutuca/themes. However, I found the process of creating a new theme cumbersome. There is no central starting point. The documentation for both the engine options (gtkrc's and such) and general theming practices (the format of the index.theme files, folders, and so on) is scarce; how-tos and tutorials are often old or subject to lots of opinionated debate, and the results are confusing (to me, coming from a web development background, at least). So I wanted to ask the fellow GTK themers and artists out there: which tools do you use to create a new theme, and what does your average workflow look like?

    Read the article

  • Physics from other games

    - by Carlosrdz1
    I'm making a platform engine with XNA Game Studio, and I've solved almost everything about collision handling. But now I'm searching for good physics for the player: I'm trying to emulate characters from other games, like Mario from Super Mario World or Mega Man X. Do you know a website or something where the physics of those games are documented? I remember seeing a page with something like that. Or what process do you think is best for emulating physics from other games? Just trial and error? Thank you.
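
    Failing a reference site, the usual route is structured trial and error: implement a standard per-frame jump model and tune its constants against footage of the game being imitated. A minimal sketch in Python rather than C#/XNA, where all constants are arbitrary starting values, not Super Mario World's real ones:

        # Per-frame jump integration: tune the constants until the arc feels
        # like the game being imitated. All values are arbitrary starting points.
        GRAVITY = 0.5        # pixels per frame^2
        JUMP_SPEED = -10.0   # initial upward velocity (negative = up)
        MAX_FALL = 12.0      # terminal falling speed

        y, vy, on_ground = 0.0, 0.0, True

        def step(jump_pressed: bool) -> None:
            global y, vy, on_ground
            if jump_pressed and on_ground:
                vy, on_ground = JUMP_SPEED, False
            vy = min(vy + GRAVITY, MAX_FALL)   # apply gravity, cap fall speed
            y += vy
            if y >= 0.0:                       # crude ground plane at y = 0
                y, vy, on_ground = 0.0, 0.0, True

    Variable jump height (e.g. halving vy on the frame the button is released) is the usual next refinement, and it is exactly these constants you end up tuning by trial and error against the original game.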

    Read the article

  • Ubuntu chroot “No such file or directory”

    - by Paris
    Hi there. I have a web application where I create some folders on my server and put executables in them. Then I try to run them with chroot, but I get a message that access is denied. I tried chmod -R 777 on the directory, and then I get a message that the folder or the file that I call (sudo chroot mydirectory myfile_inside_mydirectory) does not exist. This happens only when I call chroot on folders created by the web server. My web application is in PHP and I use:

        shell_exec("cp -R /var/www/comp/prison/bin $dir");
        shell_exec("cp -R /var/www/comp/prison/lib $dir");
        shell_exec("cp /var/www/janitor.out $dir/janitor.out");
        shell_exec("sudo chmod -R 777 $dir");
        $process = proc_open("sudo chroot $dir janitor.out", $descriptorspec, $pipes);

    sudo does not need a password.
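
    One hedged guess about the "does not exist" message: when a dynamically linked executable runs inside a chroot, the kernel reports "No such file or directory" if the jail is missing the program's dynamic loader or shared libraries, even though the binary itself is present. A sketch that copies everything ldd reports into the jail (the jail path is illustrative, not taken from the question):

        # Copy the shared libraries (and loader) a binary needs into a chroot
        # jail, so that running it there does not fail with ENOENT.
        import re
        import shutil
        import subprocess
        from pathlib import Path

        def copy_libs(binary: str, jail: str) -> None:
            out = subprocess.check_output(["ldd", binary], text=True)
            for lib in re.findall(r"(/\S+)", out):      # absolute paths ldd prints
                dest = Path(jail) / lib.lstrip("/")
                dest.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(lib, dest)                 # copy lib, keep permissions

        copy_libs("/var/www/janitor.out", "/path/to/jail")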

    Read the article

  • SQL Peer-to-Peer Dynamic Structured Data Processing Collaboration

    Unstructured and semi-structured XML data are now used more than structured data. But fixed structured data still keeps businesses running day in and day out, and it requires consistent, predictable, highly principled processing for correct results. For this reason, it would be very useful to have a general-purpose SQL peer-to-peer collaboration capability that can apply principled hierarchical data processing and flexible, advanced structured processing to dynamically structured data. Such processing can change the structure of the data as necessary for the required operation while preserving relational and hierarchical data principles. It can operate freely across remote, unrelated peer locations at any time, transparently handling unpredictable or unknown structures and data type changes by maintaining metadata automatically for immediate processing.

    Read the article

  • New Procurement Report for Transportation Sourcing

    - by John Murphy
    Welcome to our fourth annual transportation procurement benchmark report. American Shipper, in partnership with the Council of Supply Chain Management Professionals (CSCMP) and the Retail Industry Leaders Association (RILA), surveyed roughly 275 transportation buyers and sellers on procurement practices, processes, technologies, and results. Some key findings:
    • Manual, spreadsheet-based procurement processes remain the most prevalent among transportation buyers, at 42 percent of the total.
    • Another 25 percent of respondents use a hybrid platform, which presumably means these buyers are using spreadsheets for at least one mode and/or geography.
    • Only 23 percent of buyers use a completely systems-based approach of some kind.
    • Shippers were in a holding pattern with regard to investment in procurement systems over the past year.
    • Roughly three-quarters of survey respondents report that transportation spend increased in 2012, although the pace has declined slightly from last year's increases.
    • Nearly every survey respondent purchases multiple modes of transportation.
    • The number of respondents with plans to address technology to support the procurement process increased in 2012. About one quarter of respondents who do not have a system report that they have a budget for this investment in the next two years.

    Read the article

  • I want to consolidate two sites into a third. Will my search engine rankings be penalized if I rewrite and redirect pages one by one?

    - by Patrick Kenny
    I have two Drupal sites with different content-- let's call them Apple and Orange. I recently developed a much more sophisticated third Drupal site-- let's call it Tree. For a large number of reasons, the content on Apple and Orange is useful for the users of Tree, so I want to move the content to Tree. However, much of the content is out of date. (This whole process took about five years.) To update the content, I will rewrite it one article at a time myself. Now here's my question: if I move the articles one by one (as I rewrite them) and then redirect the old articles (using a 301 redirect) on Apple/Orange to the new site on Tree, will this have a huge negative effect on my search engine rankings? Is there a good way to redirect among sites when they merge like this, or would I be better off keeping the old articles on Apple/Orange and simply linking them to the new, rewritten articles on Tree?

    Read the article

  • Ask the Readers: How Do You Set Up a Novice-Proof Computer?

    - by Jason Fitzpatrick
    You're into technology, you like tweaking and tinkering with computers, and, most importantly, you know how to keep your computer from turning into a virus-laden and fiery wreck. What about the rest of your family and friends? How do you set up a novice-proof computer to keep them secure, updated, and happy? It's no small task protecting a computer from an inexperienced user, but for the benefit of both the novice and the innocent computer it's an important undertaking. This week we want to hear all about your tips, tricks, and techniques for configuring the computers of your friends and relatives to save them from themselves (and keep their computers running smoothly in the process). Sound off in the comments with your tricks, and check back in on Friday for the What You Said roundup to see how your fellow readers get the job done.

    Read the article

  • More about the Juju cache

    - by koxon
    I would like to know more about the Juju cache. After I start a charm and Juju provisions a machine for the first time, the charm is downloaded and all dependencies are installed (apt-get, etc.). This process can be very long. Once the charm has been built, configured, and deployed once, can Juju provision more instances of the same charm pre-built and configured? I presume that's what the cache is for, but the documentation is not very explicit: https://juju.ubuntu.com/docs/charms-deploying.html. If it works that way, how does Juju track the state of the charm? Thanks, Nic

    Read the article

  • Can't print, CUPS package corrupted and hangs on re-install

    - by Little Bobby Tables
    When I upgraded to Ubuntu 10.04, the upgrade process got stuck on the post-installation of the CUPS package. I had to kill processes and run several forced updates before I could finally get regular updates again. Ever since, I can't print: the printed output comes out garbled and crashes the printer. I also can't re-install CUPS, as each time the installation hangs and I have to kill it before it completes. I have tried to find a workaround for this problem, but in vain. Does anyone know how to bypass this? Or at least why the post-installation can hang, and how to re-install a problematic package? Some system specs and other hints: Dell D630 laptop running Ubuntu 10.04, GNOME desktop, standard LAN network, printing to an LPD server. Everything worked fine on 9.10. Also, the files themselves are not corrupted, and the problem does not seem to be Evince-specific, but common to all printouts.

    Read the article

  • What framework & technology would you use to make a site like fancy.com [on hold]

    - by adriancdperu
    I'm about to start a 3-month process to build something similar to fancy.com. What are your opinions on the best language + framework (server-side)? I was thinking LAMP with CakePHP. What are the important technological issues to consider when developing? MAU assumption: 1 million. Main features:
    • Registration with Facebook
    • Normal registration
    • Search articles
    • Like articles
    • Post articles
    • Comment on articles
    • Suggest articles to friends via mail, Facebook, Twitter, and about 3 or 4 more APIs
    • Ranking of articles as a cron job run every minute, with many criteria and many rankings
    • Follow users
    • Data-mining users to mail them every day with articles they have a high probability of liking
    • Operational tools for admins to add articles and close user accounts
    • Mobile focus

    Read the article

  • Upgrading Visual Studio with Crystal Reports

    - by jkrebsbach
    In the process of updating an app from Visual Studio 2003 to VS 2008. It happens to have a couple dozen Crystal Reports that it executes regularly. I upgraded Visual Studio to 2008, and when attempting to generate the reports an exception was thrown. A significant portion of the rendering engine for Crystal Reports comes not from Crystal but from Visual Studio, and those methods and properties have changed over the years. I needed to upgrade the report-generating methods from the VS 2003 way of doing things to the VS 2008 way for the reports to generate successfully. Not only that, but while we were previously rendering with Crystal 9 in VS 2003, Visual Studio 2008 renders per Crystal 10, which treats things like column widths in Excel differently (by default, at least). So now we have to go through all of our reports and compare outputs just to upgrade the Visual Studio environment that I foolishly believed would not be affected.

    Read the article

  • Advice for transitioning into a Java developer position from doing other types of web development?

    - by Rick
    I've been doing web programming and currently work at a company working mainly with PHP and JavaScript. For a while now I've been growing more and more frustrated with the shortcomings of this type of development and want to move to a company with a more defined development process that values doing things the "right way", such as using unit tests, dependency injection/IoC, etc. I've been learning JEE/Java as much as I can on my own time, but would really like to make the switch to doing this as my career and leave the PHP world behind altogether. I'm just wondering if anyone can give me advice on where to put my main focus right now to make myself marketable as an entry-level Java developer. Basically, I feel that I'm not really learning anything new at my current job that will benefit me, and it's only making me more frustrated, so I figure if there is any way to position myself for a transition I would rather do it sooner than later. Thanks for any advice.

    Read the article

  • A bacon- (and module-) saving PowerShell incident

    - by AaronBertrand
    Earlier today I made a big goof. I opened a module in Notepad, intending to use it as the basis for a new module. I was in the process of using "File > Save As" when my phone rang, at just the precise instant that, for some reason, made me click "File > Save" by mistake. After hitting Ctrl+Z 30 times to try to get the old version of the module back, I remembered that Notepad has never had more than one level of Undo. Back when I was coding ASP by hand, I was very well aware of this, but I...(read more)

    Read the article

  • ffmpeg: cut multiple input files with seeking to one output file

    - by Josef Kufner
    I have a list of video files (loaded from a database), each with the start and end time of the requested interval:

        # file    begin  end
        v1.mp4    1:01   2:01
        v2.mp4    3:02   3:32
        v3.mp4    2:03   5:23

    And I need to create a single video file containing these intervals: [0:00]---v1---[2:00]---v2---[2:30]---v3---[5:50]. I prefer using ffmpeg, since it is installed on the server. The caller program is written in PHP. It is easy to cut one input to one output (argument escaping removed for clarity): exec("ffmpeg -ss $begin -i $input_file -ss $begin -c copy $output_file"); Is there any easier way than executing ffmpeg for each interval and then executing it once more to concatenate the prepared clips together? I really do not want a lot of temporary files or to deal with complex process handling.
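
    If a single ffmpeg invocation is acceptable, one possibility is the trim/atrim + concat filter approach, which avoids temporary files at the cost of re-encoding (stream copy with -c copy cannot be combined with filters). A sketch in Python rather than PHP, with the intervals converted to seconds:

        # One ffmpeg run: trim each input with trim/atrim, reset timestamps,
        # and concatenate the pieces into a single output. No temp files.
        import subprocess

        # (file, begin_seconds, end_seconds) - converted from the table above
        clips = [("v1.mp4", 61, 121), ("v2.mp4", 182, 212), ("v3.mp4", 123, 323)]

        inputs, chains = [], []
        for i, (path, start, end) in enumerate(clips):
            inputs += ["-i", path]
            chains.append(f"[{i}:v]trim=start={start}:end={end},setpts=PTS-STARTPTS[v{i}];")
            chains.append(f"[{i}:a]atrim=start={start}:end={end},asetpts=PTS-STARTPTS[a{i}];")
        pads = "".join(f"[v{i}][a{i}]" for i in range(len(clips)))
        graph = "".join(chains) + f"{pads}concat=n={len(clips)}:v=1:a=1[v][a]"

        subprocess.run(["ffmpeg", *inputs, "-filter_complex", graph,
                        "-map", "[v]", "-map", "[a]", "output.mp4"], check=True)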

    Read the article

  • JCP Survey!

    - by Yolande Poirier
    The London Java Community (LJC), an Executive Committee member of the Java Community Process (JCP), is asking Java developers to participate in a JCP survey titled "What should the JCP be doing?" The JCP is the mechanism that decides on future standards related to Java technology. Those standards give users like you a choice of technologies to develop with and more independence from vendor solutions. The JCP cares about community feedback and has successfully encouraged community participation using transparent tracking processes. Take the survey; your feedback matters.

    Read the article

  • Webcor Builders Coordinates Construction Schedules and Mitigates Potential Delays More Efficiently with Integrated Project Management

    - by Sylvie MacKenzie, PMP
    With more than 40 years of commercial construction experience, Webcor Builders is a leading builder of distinguished, high-profile projects, including high-rise condominiums and hotels, laboratories, healthcare centers, and public works projects. Webcor is also known for its award-winning concrete, interior construction, historic restoration, and seismic renovation work. The company has completed more than 50 million square feet of projects to date. Considering the variety and complexity of the construction projects Webcor undertakes, an integrated project management solution is critical to ensuring optimal efficiency and completing client projects on time and on budget. The company previously used a number of scheduling systems for its various building projects. These packages provided different levels of schedule detail and required schedulers, engineers, and other employees to learn multiple systems. From an IT cost and complexity perspective, the company had to manage multiple scheduling systems and pay for multiple sets of licenses. The company looked to standardize on an enterprise project management system, and selected Oracle's Primavera P6 Enterprise Project Portfolio Management. Webcor uses the solution's advanced capabilities to schedule complex projects, analyze delays, model and propose multiple scenarios to demonstrate and mitigate delays and cost overruns, and process that information efficiently to deliver the scheduling precision that public and private projects require. In fact, the solution was instrumental in the company's expansion into public sector projects during the recent economic downturn; with Primavera P6 in place, it can deliver the precise schedule and milestone reporting required for large public projects.
    The solution is instrumental in managing the high-profile University of California, Berkeley Memorial Stadium project, for which Webcor was hired as construction manager and general contractor. The stadium renovation is a fast-paced project located near the seismically active Hayward Fault Zone. Due to the University of California's football schedule, meeting the University's deadline for the coming season placed Webcor in a situation where risk awareness and early warnings of issues would be paramount. Webcor and the extended project team needed a solution that could instantly analyze alternate scenarios to mitigate potential delays. They also needed to enable multiple stakeholders to use an internet-based platform to access the schedule from various locations, and to model complicated sequencing requirements where swift decisions would be made to keep the project on track. The schedule is an integral part of Webcor's construction management process for the stadium project. Rather than providing the client with the industry-standard monthly update, Webcor updates the critical path method (CPM) schedule on a weekly basis. The project team also reviews the schedule weekly to confirm that progress and forecasted performance are accurate. Hired by the University for its ability to deliver in high-risk environments, the Webcor team was recently hit with a design supplement that could have added up to 70 days to the project. Using Oracle Primavera P6, the team sprang into action, analyzing multiple "what if" scenarios to review mitigation means and methods. Determined to make sure the Bears could take the field in the coming season, the project team nearly eliminated the impact with their creative analysis in working the schedule. The total time from the issuance of the final design supplement to an agreed mitigation response was less than one week; leveraging the Oracle Primavera solution, Webcor was able to deliver superior customer value. With the ability to efficiently manage projects and schedules, Webcor can ensure it completes its projects on time and on budget, as well as inform clients about what changes to plans will mean in terms of delays and additional costs. Read the complete customer case study at: http://www.oracle.com/us/corporate/customers/customersearch/webcor-builders-1-primavera-ss-1639886.html

    Read the article

  • Game has noticeable frame drops, but when run through a profiler it always runs smoothly

    - by felipedrl
    I'm trying to optimize my PC game, but I can't find the bottleneck, since every time I run it through a profiler (gDEBugger) it runs smoothly. When running outside gDEBugger I get these annoying hiccups. It's not just the graphics; the sound also gets choppy. The drops are inconsistent across runs, i.e., sometimes I run the same scenario and get no drops at all, sometimes I get a few drops, and other times the game is consistently slow. The only constant is: when running through gDEBugger I ALWAYS get a smooth run. I suspect something outside my game is interfering and causing these drops, but what in the hell does gDEBugger do that nullifies them? A higher process priority? Any ideas? Thanks in advance.

    Read the article

  • In centralized version control, is it always good to update often?

    - by janos
    Assuming that: You are in a team developing some software. Your team is using centralized version control in the development process. You are working on a new feature which will surely take several days to complete, and you won't be able to commit before that because it would break the build. Your team members commit something every day that affects some of the files you're working with for your fancy new feature. Since this is centralized version control, you will have to update your local checkout at some point: at least once right before committing the new feature. If you update only once right before your commit, then there might be a lot of conflicts due to the many other changes by your teammates, which could be a world of pain to resolve all at once. Or, you could update often, and even if there are a few conflicts to resolve day by day, it should be easier to do, little by little. Can we say that it is always a good idea to update often?

    Read the article

  • A Patent for Workload Management Based on Service Level Objectives

    - by jsavit
    I'm very pleased to announce that after a tiny :-) wait of about 5 years, my patent application for a workload manager was finally approved.

    Background

    Many operating systems have a resource manager which lets you control machine resources. For example, Solaris provides controls for CPU with several options: shares for proportional CPU allocation (if you have twice as many shares as me, and we are competing for CPU, you'll get about twice as many CPU cycles); dedicated CPU allocation, in which a number of CPUs are exclusively dedicated to an application's use (you can say that a zone or project "owns" 8 CPUs on a 32-CPU machine, for example); and capped CPU, in which you specify the upper bound, or cap, of how much CPU an application gets (for example, you can throttle an application to 0.125 of a CPU). This isn't meant to be an exhaustive list of Solaris RM controls.

    Workload management

    Useful as that is (and tragic that some other operating systems have so little resource management and isolation that they frighten people into running only one app per OS instance, wastefully sizing every server for the peak workload it might experience), that's not really workload management. With resource management, one controls the resources and hopes that's enough to meet application service objectives. In practice, we hold resource distribution constant, see if that was good enough, and adjust the distribution if it didn't meet service level objectives. Here's an example of what happens today: Let's try 30% dedicated CPU. Not enough? Let's try 80%. Oh, that's too much - we're achieving much better response time than the objective, but other workloads are starving. Let's back that off and try again. It's not the process I object to - it's that we too often do it manually. Worse, we sometimes identify and adjust the wrong resource and fiddle with it to no useful result. Back in my days as a customer managing large systems, one of my users would call me up to beg for a "CPU boost": Me: "It won't make any difference - there's plenty of spare CPU to be had, and your application is completely I/O bound." User: "Please do it anyway." Me: "Oh, all right, but it won't do you any good." (I did, because he was a friend, but it didn't help.)

    Prior art

    There are some operating environments that take a stab at workload management (rather than resource management), but I find them lacking. I know of one that uses synthetic "service units" composed of the sum of CPU, I/O, and memory allocations multiplied by weighting factors. A workload is set to a target rate of service units consumed per second. But this seems to miss a key point: what is the relationship between artificial "service units" and actually meeting a throughput or response time objective? What if I get plenty of one of the components (so I am getting enough service units), but not enough of the resource needed to remove the bottleneck?

    Actual workload management

    That's not really the answer either. What is needed is to specify a workload's service levels in terms of externally visible metrics that are meaningful to a business, such as response times or transactions per second, and have the workload manager figure out which resources are not being adequately provided, then adjust them as needed. If an application is not meeting its service level objectives and the reason is that it's not getting enough CPU cycles, adjust its CPU resource accordingly.
    If the reason is that the application isn't getting enough RAM to keep its working set in memory, then adjust its RAM assignment appropriately so it stops swapping. Simple idea, but that's a task we keep dumping on system administrators. In other words: don't hold the number of CPU shares constant and watch the achievement of the service level vary. Instead, hold the service level constant, and dynamically adjust the number of CPU shares (or the amount of other resources, like RAM or I/O bandwidth) in order to meet the objective.

    Instrumenting non-instrumented applications

    There's one little problem here: how do I measure application performance in a way that relates to a service level? I don't want to do it based on internal resources like the number of CPU seconds received per minute - we need to make resource decisions based on externally visible and meaningful measures of performance, not synthetic items or internal resource counters. If I have a way of marking the beginning and end of a transaction, I can measure whether or not the application is meeting an objective based on it. If I can observe the delay factors for an application, I can see which resource shortages are slowing it enough to keep it from meeting its objectives. I can then adjust resource allocations to relieve those shortages. Fortunately, Solaris provides facilities for both marking application progress and determining what factors cause application latency. The Solaris DTrace facility lets me introspect on application behavior: in particular, I can see events like "receive a web hit" and "respond to that web hit", so I can get transaction rate and response time. DTrace (and tools like prstat) lets me see where latency is being added to an application, so I know which resource to adjust.

    Summary

    After a delay of a mere few years, I am the proud creator of a patent (advice to anyone interested in going through the process: don't hold your breath!). The fundamental idea is fairly simple: instead of holding resources constant and suffering variable success in meeting service level objectives, properly characterise the service level objective in meaningful terms, instrument the application to see whether it's meeting the objective, and then have a workload manager change resource allocations to remove the delays preventing service level attainment. I've done it by hand for a long time - I think that's what a computer should do for me.
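
    For the flavor of it, the control loop is ordinary feedback: measure an externally meaningful metric, compare it to the objective, and nudge the limiting resource. A toy Python sketch in which measure_response_time and set_cpu_shares are hypothetical hooks standing in for DTrace instrumentation and the OS resource manager (the measurement is simulated so the sketch runs standalone):

        # Toy feedback loop: hold the service level objective constant, vary
        # the resource. The two hooks are hypothetical stand-ins.
        import random
        import time

        TARGET_MS = 200.0   # service level objective: response time in ms
        STEP = 10           # CPU shares added/removed per adjustment

        def measure_response_time() -> float:
            # Hypothetical hook: would read transaction latency from DTrace probes.
            return random.gauss(TARGET_MS, 40.0)

        def set_cpu_shares(shares: int) -> None:
            # Hypothetical hook: would call the OS resource manager.
            print(f"cpu shares -> {shares}")

        def control_loop(iterations: int = 10) -> None:
            shares = 100
            for _ in range(iterations):
                observed = measure_response_time()
                if observed > TARGET_MS * 1.1:      # missing the objective: grant CPU
                    shares += STEP
                elif observed < TARGET_MS * 0.9:    # comfortably ahead: give some back
                    shares = max(STEP, shares - STEP)
                set_cpu_shares(shares)
                time.sleep(0.1)                     # would be seconds on a real system

        control_loop()

    A real implementation would of course adjust whichever resource the delay analysis identifies, not just CPU shares, which is the point of the patent.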

    Read the article

  • gpg passphrase error

    - by shantanu
    I just created my GPG key following the packaging page documentation at developer.ubuntu.com and uploaded the public key to a key server. When I imported the key into Launchpad, it sent me an encrypted message. Now I give a command to decrypt the message: gpg -d launchpad.txt. It asked for a passphrase; I gave the one I used to create the GPG key, but it shows an error again and again. I repeated the process (GPG key creation) but got stuck at the same point, the passphrase during decryption. What is the problem?

    Read the article

  • How can I get Ubuntu 12.04 to boot on a Gigabyte 990FX MB (EFI problem?)

    - by Jeffrey
    I am trying to install Ubuntu 12.04 on a home-built system with a Gigabyte 990FXA motherboard. I can make it through the install process, but the system does not boot afterwards. Windows 7 does boot on the same machine. SUSE 11.4 boots on the same machine; SUSE 12.4 does not. I think there may be an issue with the EFI/GPT setup, but I know very little about these systems. I really expected the machine to boot. What can I do to get the system to boot? Please direct me to a path to troubleshoot this problem. Thanks, Jeffrey

    Read the article
