Search Results

Search found 20799 results on 832 pages for 'long integer'.

  • Backup Script - Could Not Open Input File

    - by Iestyn
    This is the backup script that I've got going: http://pastebin.com/4g4E6wUz

    This is the cron entry:

        /usr/local/bin/php /home/backups/backup-db.php --filename-dated ALL

    No matter what I do, I keep getting this error:

        Could not open input file: /home/backups/backup-db.php

    That is the correct location of the file. I just don't know what else to try. I feel I've been working on this for so long that I've explored every avenue; on the other hand, sometimes I think the time I've spent on it is clouding my thoughts and I'm missing something stupidly obvious. Can someone give me a few pointers? On a last note, does anyone know of a way/article to automatically generate a full backup of cPanel every * days and store it in a location that I choose? Kind regards.

  • An experiment: unlimited free trial

    - by Alex.Davies
    The .NET Demon team have just implemented an experiment that is quite a break from Red Gate's normal business model. Instead of the tool expiring after the trial period, it now continues to work, but with a new message that appears after the tool has saved you a certain amount of time. The rationale is that a user who stops using .NET Demon because the trial expired isn't doing anyone any good. We'd much rather people continue using it forever, as long as everyone who finds it useful and can afford it still pays for it. Hopefully the message is annoying enough to achieve that, but not so annoying that people uninstall the tool. It's true that many companies have tried it before with mixed results, but we have a secret weapon. The perfect nag message? The neat thing for .NET Demon is that we can easily measure exactly how much time .NET Demon has saved you, in terms of unnecessary project builds that Visual Studio would have done. When you press F5, the message shows you the time saved, and then makes you wait a shorter time before starting your application. Confronted with the truth about how amazing .NET Demon is, who can do anything but buy it? The real secret, though, is that while you wait, .NET Demon gives you entertainment, in the form of a picture of a cute kitten. I've only had time to embed one kitten so far, but the eventual aim is for a random different kitten to appear each time. The psychological health benefits of a dose of kittens in the daily life of the developer are obvious. My only concern is that people will complain after paying for .NET Demon that the kittens are gone.

  • Xorg crashes ever since using Nvidia dual monitors

    - by legion
    Ever since I set up dual monitors with the NVIDIA X Server Settings program, my Xorg process has been crashing after a while, generally a pretty long while (around 6 hours). Before NVIDIA changed my xorg.conf file, Xorg only crashed maybe twice in 2 months, and I can't figure out what is going on. I am running Ubuntu 12.04 with the MATE desktop environment v1.2.0. Here is my xorg.conf:

        # nvidia-settings: X configuration file generated by nvidia-settings
        # nvidia-settings: version 295.33 (buildd@allspice) Fri Mar 30 15:25:24 UTC 2012

        Section "ServerLayout"
            Identifier     "Layout0"
            Screen      0  "Screen0" 0 0
            InputDevice    "Keyboard0" "CoreKeyboard"
            InputDevice    "Mouse0" "CorePointer"
            Option         "Xinerama" "0"
        EndSection

        Section "Files"
        EndSection

        Section "InputDevice"
            # generated from default
            Identifier     "Mouse0"
            Driver         "mouse"
            Option         "Protocol" "auto"
            Option         "Device" "/dev/psaux"
            Option         "Emulate3Buttons" "no"
            Option         "ZAxisMapping" "4 5"
        EndSection

        Section "InputDevice"
            # generated from default
            Identifier     "Keyboard0"
            Driver         "kbd"
        EndSection

        Section "Monitor"
            # HorizSync source: edid, VertRefresh source: edid
            Identifier     "Monitor0"
            VendorName     "Unknown"
            ModelName      "LEN"
            HorizSync      51.8 - 55.8
            VertRefresh    40.0 - 60.0
            Option         "DPMS"
        EndSection

        Section "Device"
            Identifier     "Device0"
            Driver         "nvidia"
            VendorName     "NVIDIA Corporation"
            BoardName      "NVS 3100M"
        EndSection

        Section "Screen"
            # Removed Option "TwinView" "0"
            # Removed Option "metamodes" "DFP-0: nvidia-auto-select +0+0"
            # Removed Option "TwinView" "1"
            # Removed Option "metamodes" "DFP-1: nvidia-auto-select +0+0, CRT: nvidia-auto-select +1920+0"
            Identifier     "Screen0"
            Device         "Device0"
            Monitor        "Monitor0"
            DefaultDepth    24
            Option         "TwinViewXineramaInfoOrder" "DFP-1"
            Option         "TwinView" "0"
            Option         "metamodes" "DFP-0: nvidia-auto-select +0+0"
            SubSection     "Display"
                Depth       24
            EndSubSection
        EndSection

  • Sample domain model for online store

    - by Carel
    We are a group of 4 software development students currently studying at the Cape Peninsula University of Technology. We are tasked with developing a web application that functions as an online store. We decided to do the back-end in Java while making use of Google Guice for persistence (which is mostly irrelevant to my question), and the general idea so far is to use PHP to create the website. We decided that we would like to try, after handing in the project, to register a business and actually implement the website. The problem we have been experiencing is with the domain model. These are mostly small issues, but they are starting to impact the schedule of our project. Since we are all young IT students, we have virtually no experience in the business world, so we spend a significant amount of time planning the domain model in the first place. One of the issues we're picking up is the reference between the Customer entity and the Order entity. Currently, we don't have the customer id in the Order entity, and we have a list of Order entities in the Customer entity. Lately, I have wondered whether the persistence mechanism will put the customer id physically in the order table, even if it's not in the entity. So I started wondering: if you load a Customer object, will it search the entire order table for orders with that customer's id? Say you have 10,000 customers and 500,000 orders; won't this take an extremely long time? There are also some business processes that I'm not completely clear on. Finally, my question is: does anyone know of a sample domain model out there that is similar to what we're trying to achieve and that is safe to look at as a reference? I don't want to be accused of stealing anybody's intellectual property, especially since we might implement this as a business.
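    For illustration only, here is a minimal JPA-style sketch of the Customer/Order relationship being discussed (this assumes a JPA-like ORM is doing the persistence, which is an assumption, since the question only mentions Guice; the class, table, and column names are hypothetical):

        // Customer.java
        import javax.persistence.*;
        import java.util.ArrayList;
        import java.util.List;

        // Customer side: no foreign key column here; the mapping is owned by Order.
        @Entity
        class Customer {
            @Id @GeneratedValue
            private Long id;

            // LAZY means the orders are not fetched when the customer is loaded,
            // only when the collection is first accessed.
            @OneToMany(mappedBy = "customer", fetch = FetchType.LAZY)
            private List<Order> orders = new ArrayList<>();
        }

        // Order.java
        import javax.persistence.*;

        // Order side: this is where the customer_id column physically lives.
        @Entity
        @Table(name = "customer_order")   // "order" is a reserved word in SQL
        class Order {
            @Id @GeneratedValue
            private Long id;

            @ManyToOne(fetch = FetchType.LAZY)
            @JoinColumn(name = "customer_id")   // foreign key column in the order table
            private Customer customer;
        }

    In a mapping like this the customer id does end up as a plain foreign-key column in the order table, even though the Order entity only holds a Customer reference; with an index on that column, loading one customer's orders is an indexed lookup rather than a scan of all 500,000 rows, and with lazy fetching the orders are not pulled at all until the collection is actually used.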

  • Are all SFP+ transceivers usable for FEX between Nexus 5000 and Nexus 2000?

    - by Alain O'Dea
    I am looking at building a network with Nexus 5000 parent switches and Nexus 2000 fabric extenders. The mystery at the moment is what kind of SFP+ transceivers are required for cross-connecting racks. Right now I am considering FET-10G, but I am not sure that 100m is long enough, since this is a rented rack environment and the separation between racks is potentially very large. Are all SFP+ transceivers usable for FEX between Nexus 5000 and Nexus 2000? Specifically, can SFP-10G-SR transceivers be used for longer-distance FEX?

  • Is it safe to run an operating system from a USB flash drive?

    - by Georg
    I've got a laptop that has a broken hard disk controller, and replacing the motherboard is quite expensive. I thought about buying a flash drive and installing and running the system from it. However, I'm concerned about some things. Speed: are they fast enough for swap memory? (I've got only 1GB of RAM installed, and I'm considering buying 2 or 3 drives and making them into a RAID.) What about limited write cycles? How long will a drive last under a system that has a filesystem with journaling enabled? I'd hate to have to abandon it. Are there significant differences between the internal SSDs used in modern laptops like MacBooks and USB flash drives? What should I expect in 10 years when the memory wear starts kicking in?

  • Why is Ubuntu One slow to sync in 11.10, whether for backup or for sub-folder contents?

    - by pst007x
    I have been trying to sync my Documents folder of 1.4GB; it still hasn't finished, and it has been syncing for a month. The top level syncs (files and folders directly in the Documents folder), but the contents of sub-folders just hang. (I gave up and stopped syncing this folder.) I have also tried using the backup facility in 11.10 to back up to Ubuntu One, after upgrading my Ubuntu One storage. It has been going for about 24 hours now and has only backed up what looks like a couple of percent. (By the way, what an excellent idea to back up to Ubuntu One, if only we could get it to actually work! :-o) The odd thing is that I can sync to Dropbox within hours, rather than months. This is bad, and has been an issue since Ubuntu One's release. I have reported this problem and there were promises it would be fixed in later releases, but it hasn't been. Canonical cannot help either... I posted on several blogs; a lot of people have the same problem but no fixes. So do I use Dropbox or another service until it is sorted? Ubuntu does not seem to see this as an issue, so I think a fix will be a long time coming. (However, I love the potential of Ubuntu One and its integration with the OS.) Yes, my internet speeds are fine, etc. :-) No firewall (sudo ufw status: STATUS: INACTIVE), no proxy, etc. NB: I have raised this as a separate question to others posted here because my question relates to Ubuntu 11.10, though I have commented elsewhere for help; it also relates to deja-dup backup to Ubuntu One. Thanks

  • Adventures in Lab Management Configuration: Part 3 of 3

    - by Enrique Lima
    This is long overdue, but here it is. In the previous two sections I discussed how I got CMMI v4.2 to take on the same fields as v5 and therefore be able to communicate with MTM and Lab Manager, and that was quite a success. Yet when I opened Lab Management, even though it was fully aware of the VMs being there, it refused to let me enroll them into an environment. It kept stating there was no suitable host to deploy the VM to, error TF259115. This was an indication that something was not matching the expected network configuration between TFS and Hyper-V/SCVMM. So, here are a couple of things that took place:
    1. Verified that the network segment specified for network isolation matched what was physically configured, for either DHCP or manually assigned IP addressing, for the guest VMs.
    2. Made sure TFS was fully aware of the configuration settings for the network location name. For that I issued:
        tfsconfig lab /settings /networklocation:"<name of the network location configured in SCVMM>"
    That last item was key to making sure Lab Management communicated with the VMs and allowed enrollment into the new virtual environment.

  • How do I get others to see past my prior inexperience?

    - by Kevin
    My core question is how I proceed from the following predicament. I will be honest with you: I wasted my college experience. I slacked off and didn't take any of my comp sci classes that seriously; somehow I still got out with a 3.25 GPA. But truth be told, I learned nothing. I befriended most of my professors, who went pretty lenient on me in terms of grading. However, I basically came out of college knowing how to program a simple calculator in VB.Net. I was (to my great surprise) hired by a very large, respected company in Denver as a junior developer. Well, the long and the short of it is that I knew so little about programming that I quickly became the office pariah and was almost fired due to my incompetence. It has been 8 months now, and I feel I have learned some basic things and I am not picked on as much as I used to be by the other developers. However, everyone hates me, and the first few months have given the other developers a horrible perception of me. I am no longer afraid of code or of learning, but I have put myself in the precarious position of being the scapegoat of our department. I hate going to work every day because no one there is my friend and pretty much everyone is hostile to me. What should I do? Any advice?

  • After installing Windows, what should I do first: update or install antivirus?

    - by EApubs
    Normally, after reformatting and installing Windows 7, I go online and install all the updates, install all the driver updates, and then install the antivirus. That's because long ago, when I installed the antivirus first, applying Windows updates crashed the AV! So I install it last. Especially since 7 SP1 is critical, right? But now I'm having doubts... Going online without an antivirus means I'm vulnerable! (I have a home router which has a small firewall, but I'm not sure about it.) So, what's the best thing to do: install the antivirus first or install the updates first?

  • Sucking Less Every Year?

    - by AdityaGameProgrammer
    Sucking Less Every Year - a train of thought that has been on my mind for a while. Quoting directly from the post: "I've often thought that sucking less every year is how humble programmers improve. You should be unhappy with code you wrote a year ago. If you aren't, that means either A) you haven't learned anything in a year, B) your code can't be improved, or C) you never revisit old code. All of these are the kiss of death for software developers." How often does this happen or not happen to you? How long before you see an actual improvement in your coding - a month, a year? Do you ever revisit your old code? How often does your old code plague you, or how often do you have to deal with your technical debt? It is definitely very painful to fix old bugs and dirty code that we may have written to quickly meet a deadline, and in some cases those quick fixes mean we may have to rewrite most of the application/code. No arguments about that. Some of the developers I have come across argued that they were already at the evolved stage where their coding doesn't need improvement or can't be improved anymore. Does this happen? If so, how many years of coding in a particular language does one expect this to take?

  • Logitech MX3200 keyboard stopped working

    - by Roy Rico
    I have a Logitech MX3200 wireless keyboard and mouse set, and the keyboard stopped working. I was working with it and it just stopped in the middle of use. Some notes: The mouse still works flawlessly. I've tried to reconnect the keyboard, and I've tried to reconnect everything; the mouse reconnects, and the keyboard reconnects (or pretends to). I've tried a hard reset (holding the connect button on the receiver for 10 seconds). I've tried new batteries. I took the batteries out of the keyboard and mouse and unplugged the receiver overnight. The media keys still work (they launch Winamp) but the main keys do not. At this point, I've tried everything I've seen online. I just want to know: is my keyboard dead in the water? I would say it's been about 8 months to a year since I bought it, and the articles I find on the internet all say that's as long as they last... Is this true? What's everyone's experience? I hope it's not true.

  • Vista - perform scheduled actions only if screen is not locked

    - by Syntax Error
    OK, here's the general idea of what I want to do. After a certain time, I would like the computer to nag me to go to sleep, maybe every five minutes or so. But I don't want the messages to pop up if the screen is locked, because I leave it like that all night. Ideally I would like to be able to do more things, like shut down running instances of the web browser or lock my user session if I ignore the notices for too long, but I'm happy with just popup messages if that's all I can do. So, how much of this is possible and where do I start? I'm not too well versed with Task Scheduler, and I'm assuming I'll use that to at least start whatever script I have to put together.

  • Empty Recycle Bin error "Cannot Delete Dc12: Access denied."

    - by Chris Noe
    The Dc number can vary. The error is sporadic, but when it happens it prevents the contents of the Recycle Bin from being deleted. It can also occur when the Recycle Bin appears to be empty, yet it still shows the crumpled-paper indicator. Rebooting makes the problem go away, but it can also magically go away by just waiting a long time, like overnight. But the problem keeps recurring with no rhyme or reason. What is causing this? I really don't want to reinstall Windows.

  • Pros and Cons of Facebook's React vs. Web Components (Polymer)

    - by CletusW
    What are the main benefits of Facebook's React over the upcoming Web Components spec, and vice versa (or perhaps a more apples-to-apples comparison would be to Google's Polymer library)? According to this JSConf EU talk and the React homepage, the main benefits of React are:
    - Decoupling and increased cohesion using a component model
    - Abstraction, composition and expressivity
    - Virtual DOM & synthetic events (which basically means they completely re-implemented the DOM and its event system)
    - Enables modern HTML5 event stuff on IE 8
    - Server-side rendering
    - Testability
    - Bindings to SVG, VML, and <canvas>
    Almost everything mentioned is being integrated into browsers natively through Web Components, except the virtual DOM concept (obviously). I can see how the virtual DOM and synthetic events can be beneficial today to support old browsers, but isn't throwing away a huge chunk of native browser code kind of like shooting yourself in the foot in the long term? As far as modern browsers are concerned, isn't that a lot of unnecessary overhead / reinventing of the wheel? Here are some things I think React is missing that Web Components will take care of. Correct me if I'm wrong:
    - Native browser support (read "guaranteed to be faster")
    - Write script in a scripting language, write styles in a styling language, write markup in a markup language
    - Style encapsulation using Shadow DOM (React instead has this, which requires writing CSS in JavaScript. Not pretty.)
    - Two-way binding

  • proxy software that supports parallel transfer

    - by est
    Hi guys, I need to set up a really fast proxy server on a remote machine. Here's the scenario: The server prefetches 3KB of data, mostly HTTP resources. Instead of sending it over a traditional HTTP or SOCKS proxy connection, the server opens a multithreaded transfer with 3 connections and sends 1KB of data per thread to each connection. The client receives 1KB x 3, combines the chunks back into the original 3KB of data, and serves it as a local HTTP proxy. The client then displays the original data in the browser via the local HTTP proxy. The latency is not important as long as the transfer rate is good. Does any software like this exist? It would be better if it were open source or free.
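    For illustration only, here is a minimal Java sketch of the split/reassemble step described above (the class and method names are hypothetical; real software would also need per-connection framing, ordering, and the actual socket handling):

        import java.util.ArrayList;
        import java.util.List;

        // Minimal sketch: split a payload into N roughly equal chunks and reassemble them.
        // In the scenario above, each chunk would travel over its own connection.
        public class ChunkedTransfer {

            static List<byte[]> split(byte[] payload, int parts) {
                List<byte[]> chunks = new ArrayList<>();
                int chunkSize = (payload.length + parts - 1) / parts; // ceiling division
                for (int offset = 0; offset < payload.length; offset += chunkSize) {
                    int end = Math.min(offset + chunkSize, payload.length);
                    byte[] chunk = new byte[end - offset];
                    System.arraycopy(payload, offset, chunk, 0, chunk.length);
                    chunks.add(chunk);
                }
                return chunks;
            }

            static byte[] reassemble(List<byte[]> chunks) {
                int total = chunks.stream().mapToInt(c -> c.length).sum();
                byte[] out = new byte[total];
                int offset = 0;
                for (byte[] chunk : chunks) {
                    System.arraycopy(chunk, 0, out, offset, chunk.length);
                    offset += chunk.length;
                }
                return out;
            }

            public static void main(String[] args) {
                byte[] data = new byte[3 * 1024];       // the 3KB example from the question
                List<byte[]> chunks = split(data, 3);   // one chunk per connection
                byte[] rebuilt = reassemble(chunks);
                System.out.println(chunks.size() + " chunks, " + rebuilt.length + " bytes rebuilt");
            }
        }

    On the server side each chunk would be written to its own socket, and on the client side the chunks would be read back in order and reassembled before being handed to the local HTTP proxy.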

  • What are your "must have", free (gratis), programs?

    - by flybywire
    Poll: What software must you always keep handy? I don't care if it is open source, freeware, or demo, as long as its price is $0. Neither do I care if it is for desktops, handhelds, netbooks, web based, cellphones. If it is free to use – and essential to your happiness and well-being – put it in this list. Rules: Please, list only ONE application per answer, so that people can vote up the items that they prefer. Please do not post applications that have already been posted - instead, up-vote the existing answer.

  • How important is Domain knowledge vs. Technical knowledge?

    - by Mayank
    I am working on a Trading and Risk Management application and although from a C# background, I have been asked to work on SSIS packages. Now I can live with that. The pain point is that there is too much emphasis on business understanding. Trading (Energy Trading to be exact) is a HUGE area and understanding every little bit of it is overwhelming. But for the past two months I have been working on understanding the business terms - Mark To Market, Risk Metrics, Positions, PnL, Greeks, Instruments, Book Structure... every little detail (you get the point). Now IMHO, this is the job of a BA. Sure it is very important for developers to understand the business but where do you draw the line? When I talked to my manager about this, he almost mocked me by saying that anybody can learn a technology in a week. It's the business that's harder. My long term aspiration is to remain on the technical side, probably become an architect (if possible). If I wanted to focus so much on business I would have pursued an MBA! I want to know if I am wrong or too naive in understanding the business importance or is my frustration justified?

  • Backing up an NTFS disk using rsync on Ubuntu

    - by user70366
    For a long time I was using Windows. I have a separate drive I use to keep copies of my media files, photos, etc. on, which I periodically back up to an external drive. In Windows I used SyncToy to do this. After my Windows stopped booting, I decided to switch to Linux (Ubuntu 10.10). That seems to be going fine, but now I want to back up my drive to the external drive like before. Mostly the two drives will already be the same, with maybe about 10GB of extra files added. So I try to use rsync to synchronise the two drives like this:
        rsync --dry-run -rvlt --modify-window=1 /media/Antonio1TB/Backup /media/FREECOM\ HDD/Backup
    The problem is that the dry run indicates that every file on the drive will be copied, not just the files I have recently added. What is the correct command to sync two NTFS drives under Ubuntu so that files that already exist don't get copied again? Thanks.

  • I have to shard a mysql database. I want to start with 12 shards on 2 machines. What is the best w

    - by Tim
    All tables are InnoDB. I would rather not use mysqldump, because the shard sizes will be about 200 GB (about 700 million rows), and that would take too long. I was hoping to just stop MySQL for an hour, copy the data files to a new machine, and start back up. But you can't do this with InnoDB, as some data is in the shared tablespace, even if I have the innodb_file_per_table option set. This is not a website but a custom application, used by tens of thousands of people right now, so uptime and performance are important. I suppose I could add logic to my server application to allow for gradual rebalancing / moving of a shard. Does anyone have a better idea?

  • Assuming "clean code/architecture" is there a difference in "effort" between PHP or Java/J2EE web application development?

    - by PhD
    A client asked us to estimate effort when selecting PHP as the implementation language for his next web-based application. We spent about a week exploring PHP, prototyping, testing, etc. We are quite new to this language - we may have hacked around with it in the past, but let's say we are PHP noobs yet application development experts (for lack of a better, less flattering term :). It seems that if we write clean, maintainable code and follow separation of concerns and enterprise architecture patterns (DAOs etc.), the 'effort' in creating an object-oriented PHP-based web application is about the same as for a Java-based one. Here's our equation for estimating the effort (development/delivery time): ConstructionEffort = f(analysis, design, coding, testing, review, deployment). We were specifically comparing effort estimates in creating an enterprise application with the following: PHP + CakePHP/CodeIgniter (should we have considered others?) versus Java + Spring + Restlet. It's an end-to-end application: Client: JavaScript/jQuery + HTML/CSS; Middle tier/business logic: (still evaluating PHP/Java); Database: MySQL. The effort estimates of the 1st and 3rd tiers are constant and relatively independent of the middle tier's technology. At a high level, with an initial breakdown of the requested features into user stories as well as a high-level SWAG on the sheer number of classes/SLOC that would be required, PHP doesn't seem to differ by much from what would be required in Java. Is this correct? We are basing our initial estimates on the initial prototyping/coding we've done with PHP - we are currently disregarding fluency with the language as a factor, since that will be an initial hurdle and not a long-term impediment IMHO (we also have sufficient time to become quite fluent with PHP). I'm interested in knowing the programmers' perspective with respect to effort when creating similar applications with either of the languages, to justify choosing one over the other. Are we missing something here? It seems we are going against the popular belief that PHP is quicker to market (or, being very fluent with Java, our vision is clouded). From what we've played around with, PHP doesn't seem to offer any coding/programming effort savings.

  • Enable Ctrl (or Alt) + arrow keys to mimic 'home' and 'end' functionality

    - by YuKagi
    I am a long-time Mac user and I'm now using an Ubuntu machine for development. While I'm more or less used to a lot of the keyboard shortcuts, one thing I can't get used to is using the 'Home' and 'End' keys to move around lines of text. On a Mac you use "Command + right arrow" to go to the end of a line and "Command + left arrow" to go to the beginning. Is there a way to enable this kind of functionality in Linux? I'm not sure if this would be considered remapping, keyboard shortcuts, or what...

  • Question regarding Readability vs Processing Time

    - by Jordy
    I am creating a flowchart for a program with multiple sequential steps. Every step should be performed only if the previous step succeeded. I use a C-based programming language, so the layout would be something like this:
    METHOD 1:
        if(step_one_succeeded())
        {
            if(step_two_succeeded())
            {
                if(step_three_succeeded())
                {
                    //etc. etc.
                }
            }
        }
    If my program had 15+ steps, the resulting code would be terribly unfriendly to read. So I changed my design and implemented a global error code that I keep passing by reference, to make everything more readable. The resulting code would be something like this:
    METHOD 2:
        int _no_error = 0;
        step_one(_no_error);
        if(_no_error == 0)
            step_two(_no_error);
        if(_no_error == 0)
            step_three(_no_error);
        if(_no_error == 0)
            step_four(_no_error);
    The cyclomatic complexity stays the same. Now let's say there are N steps, and let's assume that checking a condition takes 1 clock cycle and performing a step takes no time. The processing time of Method 1 can be anywhere between 1 and N. The processing time of Method 2, however, is always N-1. So Method 1 will be faster most of the time. Which brings me to my question: is it bad practice to sacrifice time in order to make the code more readable? And why (not)?
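    As an aside (not part of the original question), when the steps can be written as functions returning success/failure, short-circuit evaluation gives the flat layout of Method 2 while keeping Method 1's early exit; a small sketch in Java, with hypothetical step names:

        // A flat variant that keeps Method 1's early exit: && short-circuits,
        // so a later step only runs while every earlier step has succeeded.
        public class Pipeline {

            static boolean stepOne()   { return true; }  // hypothetical steps standing in
            static boolean stepTwo()   { return true; }  // for the ones in the question
            static boolean stepThree() { return true; }

            public static void main(String[] args) {
                boolean ok = stepOne()
                          && stepTwo()
                          && stepThree();
                System.out.println(ok ? "all steps succeeded" : "a step failed, later steps skipped");
            }
        }

    Each later step runs only if every earlier one returned true, so the cost profile matches Method 1 while the nesting stays flat.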

  • What affects video encoding speeds?

    - by Pig Head
    FRAPS doesn't compress its videos when you record, so the files are enormous. In a long recording you can end up with a few hundred gigabytes. Obviously, you usually need to convert/compress them. What affects the speed of this? I don't think the RAM does, as when I converted 600 GB my RAM usage only went to 6 GB, but the processor was at 100%, which is surprising as I have a 6-core processor @ 3.46 GHz. Would clock speed or more cores help the most?

  • Hello PCI Council, are you listening?

    - by David Dorf
    Mention "PCI" to any retailer and you'll instantly see them take a deep breath and start looking for the nearest exit.  Nobody wants to be insecure, but few actually believe that PCI does anything more than focus blame directly on retailers.  I applaud PCI for making retailers more aware of the importance of security, but did you have to make them PAINFULLY aware?  POS vendors aren't immune to this pain either as we have to undergo lengthy third-party audits in addition to the internal secure programming programs.  There's got to be a better way. There's a timely article over at StorefrontBacktalk that discusses the inequity of PCI's rules, and also mentions that the PCI Council is accepting comments until April 15th. As a vendor, my biggest issue with PCI is that they require vendors to disclose the details of any breaches, in effect "ratting out" customers.  I don't think its a vendor's place to do this.  I'd rather have the trust of my customers so we can jointly solve the problem. Mary Ann Davidson, Oracle's Chief Security Officer, has an interesting blog posting on this very topic.  Its a bit of a long read, but I found it very entertaining and thought-provoking.  Here's an excerpt: ...heading up the list of “you must be joking” regulations are recent disturbing developments in the Payment Card Industry (PCI) world. I’d like to give [the] PCI kahunas the benefit of the doubt about their intentions, except that efforts by Oracle among others to make them aware of “unfortunate side effects of your requirements” – which is as tactful I can be for reasons that I believe will become obvious below - have gone, to-date, unanswered and more importantly, unchanged. I encourage you to read the entire posting, Pain Comes Instantly, and then provide feedback to the PCI Council.
