Search Results

Search found 28590 results on 1144 pages for 'best'.


  • Drive reporting incorrect free space

    - by Oli
    So I swapped my shiny SATA SSD for an even shinier PCI-E SSD. I run my core OS on the SSD because it's silly-fast. That's how I had the old SSD set up, so I created a new EXT4 partition on the new drive and just dd'ed the data across (sorry, I no longer remember the exact command I ran), and after reinstalling GRUB I booted onto the PCI-E SSD. At first glance everything had worked perfectly and things were running faster than ever. But then I noticed the free disk space on the new, larger drive: it was almost exactly the same as on the other disk... a disk half its size. So it looks as if I copied the files across incorrectly and dragged some of the filesystem metadata along with them. Tools like du and Disk Usage Analyzer come back with the correct figures, but things that look at the partition (and not the files) still think the drive is 120GB. I've been using this drive for a week now, so it's well out of sync with the old SSD, and dumping the data and starting again isn't a job that fills me with joy, but two questions: Is there a way to fix my filesystem so it knows how big it really is? fsck, e2fsck and badblocks can all scan it without finding a problem. And if I do plug my old SSD back in, copy the data off the PCI-E drive onto it, and then copy it back onto a fresh filesystem (i.e. juggle the data around), what's the best way of doing that? I obviously want to keep all the permissions and symlinks exactly where they are.
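
    One likely explanation is that dd copied the ext4 superblock verbatim, so the filesystem still records the old partition's size. A minimal sketch of both options, assuming the new root filesystem lives on /dev/sdb1 (the device name is a placeholder) and is unmounted, e.g. from a live CD: resize2fs grows the dd-copied filesystem to fill its larger partition, while rsync -aHAX copies onto a fresh filesystem keeping permissions, ownership, hard links, ACLs, extended attributes and symlinks intact.

        # Option 1: grow the dd-copied ext4 filesystem to fill the new partition
        sudo e2fsck -f /dev/sdb1      # resize2fs insists on a clean check first
        sudo resize2fs /dev/sdb1      # with no size argument it grows to the partition size

        # Option 2: juggle the data onto a freshly created filesystem instead
        sudo mkfs.ext4 /dev/sdb1
        sudo mount /dev/sdb1 /mnt/newroot
        sudo rsync -aHAXv /mnt/oldroot/ /mnt/newroot/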

    Read the article

  • How to clean a computer with multiple accounts infected with spyware, viruses? [closed]

    - by DjKilla
    Possible Duplicate: What to do if my computer is infected by a virus or a malware? What's the best way to clean a computer with multiple accounts infected with spyware, viruses and malware? Should you install and run removal software under each account? If you install the software under one account, will it clean the entire computer, including every other account? For example, some programs like CCleaner install for a single account only and don't offer an "all users" option. Does that mean the program will clean the entire computer, including the other accounts, or do I have to install CCleaner under each account to clean up each user's profile?

    Read the article

  • Most effective way to change Linux command prompt for all users?

    - by incredimike
    I have several machines and the hostnames are really long, e.g. companyname-ux-staging-web1.companyname.com, so my prompt looks something like [root@mycompany-ux-staging-web1 ~]#. I'd like to shorten that up for all users on all machines with the least amount of work. From what I've read I have a couple of options, but they all have their drawbacks. I could change the hostname, but that would likely affect applications, so it's not a great choice. I could also alter $PS1 at login for all users by editing the .bashrc of every existing user and editing /etc/skel/.bashrc for potential new users, but that's a lot of work across 10 machines. What's my best option, or what have I overlooked?
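
    A minimal sketch of a third option, assuming bash login shells and a distro that sources /etc/profile.d (the file name, the stripped prefix and the host list are placeholders): one file dropped into /etc/profile.d sets the prompt for every existing and future user without touching any .bashrc, and a small loop pushes it to the other machines. On some distros /etc/bashrc also sets PS1 for interactive shells and may need the same line.

        # /etc/profile.d/short-prompt.sh -- sourced by every bash login shell
        short_host="${HOSTNAME%%.*}"                 # drop the domain part
        short_host="${short_host#companyname-ux-}"   # drop the long site prefix
        PS1="[\u@${short_host} \W]"'\$ '             # e.g. [root@staging-web1 ~]#

        # push the file to the remaining machines in one pass
        for h in staging-web2 staging-web3 staging-web4; do
            scp /etc/profile.d/short-prompt.sh "root@${h}:/etc/profile.d/"
        done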

    Read the article

  • How do I know if I set up the NLB (Network Load Balancing) cluster correctly?

    - by letseatlunch
    So I'll start from the very beginning. I'm working on a web conference where we are going to show about 12 videos, totalling about half a gig across all 12. Since all the participants are going to be watching (and also streaming/downloading) at once, it was recommended we set up a server farm. So I have 4 servers that I am trying to network together. They are all running Microsoft Server 2008, and I have spent the last three days setting them up; now that it's done I want to make sure it's all ready to go. I just want to be sure that everything is set up the way that I think it is. What is the best way to do this? Really, I want to make sure that the load will be split over the servers when it's showtime. Thanks for any help in advance, Dave (letseatlunch)

    Read the article

  • Aggressive Auto-Updating?

    - by MattiasK
    What do you guys think is best practice regarding auto-updating? Google Chrome, for instance, seems to auto-update itself as soon as it gets a chance, without asking, and I'm fine with that. I think most "normal" users benefit from updates being a transparent process. Then again, some more technical users might be miffed if you update their app without permission. As I see it there are a couple of options: 1) have a checkbox when installing that says "allow automatic updates"; 2) just have a preference somewhere that allows you to "disable automatic updates" so that you have to "check for updates manually". I'm leaning towards 2) because 1) feels like it might alienate non-technical users, and I'd rather avoid installation queries if possible. I'm also thinking about making it easy to downgrade if an upgrade (heaven forbid) causes trouble; what are your thoughts? Another question: even if updates happen automatically, perhaps they should be announced, if there are new features for example, otherwise users might not realize they exist and never use them. One thing that kinda scares me, though, is the security implications: someone could theoretically hack my server and push out spyware/zombieware to all my customers. It seems that using digital signatures to prevent man-in-the-middle attacks is the least you could do, otherwise you might be hooked up to a network that spoofs the address of the update server.
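
    A minimal sketch of the signing step that last sentence points at, assuming OpenSSL with an RSA key pair (file names are placeholders): the update package is signed on the build machine with a private key that never touches the update server, and the client verifies the download against the public key shipped inside the installed app before applying it, so neither a hacked server nor a spoofed address can push unsigned code.

        # build machine: sign the release with the private key
        openssl dgst -sha256 -sign private.pem -out update-1.2.3.sig update-1.2.3.zip

        # client, before installing: verify against the bundled public key
        openssl dgst -sha256 -verify public.pem -signature update-1.2.3.sig update-1.2.3.zip \
            && echo "signature OK, safe to install" \
            || echo "signature mismatch, discarding update"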

    Read the article

  • What is the easiest way to copy Chrome's logins/passwords into KeePass without creating duplicates?

    - by ldigas
    Okay, here's the thing. I have most of my login info in two places: one is a KeePass file and the other is Chrome. Being a lazy sort of person, and since Chrome/KeePass integration never really started to work the way it should, a couple of times a year I use the NirSoft tool to dump the Chrome logins/passwords into a textual .csv file and then import it into KeePass, creating lots of duplicates in the process which I then clean up, and so on. In the meantime, all the new logins I accumulate just stay in Chrome. As you might notice, this is not really the best way to do it. Is there a faster way to do this, i.e. copy logins from Chrome to KeePass without creating duplicates in KeePass, or has anyone perhaps found a way to get KeePass to work with Chrome under Windows XP SP3? KeePass 1.x or 2.x, it doesn't make a difference as long as it works.
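
    A minimal sketch of one way to trim the exported file before importing it, assuming both KeePass and the NirSoft dump can be saved as CSV with the URL in the first column and the user name in the second (the column layout and file names are assumptions, so adjust the key to match the real exports): only rows whose URL and user name pair is not already in the KeePass export survive, so the import adds nothing that would become a duplicate.

        # keep only Chrome rows whose url,user pair is not already in the KeePass export
        awk -F',' '
            NR == FNR { seen[$1 "," $2] = 1; next }   # first file: existing KeePass entries
            !(($1 "," $2) in seen)                    # second file: print unseen rows only
        ' keepass-export.csv chrome-export.csv > chrome-new-only.csv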

    Read the article

  • Web Server Setup

    - by gustyaquino
    Hello. In my workplace we want to run our own web server for at least 100 Apache/PHP/MySQL web pages. My boss is opposed to hiring skilled personnel; he thinks we can do it ourselves. Currently we are working with a HostGator reseller account. I chose CentOS as the operating system, but I don't know the best hardware solution. HP, Dell? What about the setup on these platforms? Thanks. PS: sorry for my bad English. Edit: The purpose of this migration isn't related to performance issues, but independence.

    Read the article

  • What antivirus software supports updates without an internet connection?

    - by Michael Gundlach
    I'm putting antivirus software on Windows 7 computers in the middle of Africa. The computers don't have internet access, but still need to be protected against viruses from CDs and thumb drives. Separate from these computers is one computer that does have extremely spotty internet access. What's the best AV software for this situation? The important part, as I see it, is that we need to keep the computers up to date, but we can't let the AV software suck down updates at its leisure: the computers are disconnected, and getting emails onto the connected computer is challenge enough. We thought we might transfer update files to the connected computer using a protocol that can handle repeated connection drops (e.g. FTP with resume). Then we'd manually apply the update files to the disconnected computers. Does any AV software support this? Is there a better solution?
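
    A minimal sketch of the transfer step, assuming the chosen vendor publishes offline definition packages over HTTP or rsync (the URL and file names are placeholders): both commands below resume where a dropped connection left off, after which the finished file can be carried to the offline machines on a thumb drive and applied through the product's own offline/manual update mechanism.

        # keep retrying over the spotty link and resume partial downloads
        wget --continue --tries=0 --timeout=30 \
            https://example.com/av/offline-definitions.zip

        # or, if the vendor's mirror speaks rsync, keep partial files between attempts
        rsync -av --partial --timeout=30 \
            rsync://example.com/av/offline-definitions.zip ./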

    Read the article

  • EXC_BAD_ACCESS error when box2d joint is destroyed

    - by colilo
    When I destroy the weldJoint in the update method below, I get an EXC_BAD_ACCESS error pointing to the line world->DestroyJoint(weldJoint);

        -(void) update: (ccTime) dt
        {
            int32 velocityIterations = 8;
            int32 positionIterations = 1;

            // Instruct the world to perform a single step of simulation. It is
            // generally best to keep the time step and iterations fixed.
            world->Step(dt, velocityIterations, positionIterations);

            // using the iterator pos over the set
            std::set<BodyPair *>::iterator pos;
            for (pos = bodiesForJoints.begin(); pos != bodiesForJoints.end(); ++pos)
            {
                b2WeldJointDef weldJointDef;
                BodyPair *bodyPair = *pos;
                b2Body *bodyA = bodyPair->bodyA;
                b2Body *bodyB = bodyPair->bodyB;

                weldJointDef.Initialize(bodyA, bodyB, bodyA->GetWorldCenter());
                weldJointDef.collideConnected = false;

                weldJoint = (b2WeldJoint*) world->CreateJoint(&weldJointDef);

                // Free the structure we allocated earlier.
                free(bodyPair);
                // Remove the entry from the set.
                bodiesForJoints.erase(pos);
            }

            for (b2Body *b = world->GetBodyList(); b; b = b->GetNext())
            {
                if (b->GetUserData() != NULL)
                {
                    CCSprite *mainSprite = (CCSprite*)b->GetUserData();
                    if (mainSprite.tag == 1)
                    {
                        mainSprite.position = CGPointMake(b->GetPosition().x * PTM_RATIO,
                                                          b->GetPosition().y * PTM_RATIO);
                        CGPoint mainSpritePosition = mainSprite.position;

                        if (mainSprite.isMoved)
                        {
                            world->DestroyJoint(weldJoint);
                        }
                    }
                }
            }
        }

    In HelloWorldLayer.h I declared weldJoint with the assign property. Am I destroying the joint in the wrong way? I would really appreciate any help. Thanks.

    Read the article

  • Is there a Google Authenticator desktop client?

    - by cwd
    I am using Google Authenticator for 2-step authentication, and I like how I can use a code to verify my account from my phone. I realize that the app was designed to run on a device other than a computer to increase security for the computer (in case it is lost or stolen), but I would like to know if there is a way I can run Google Authenticator on my MacBook. Per the Google Authenticator page ("What devices does Google Authenticator work on?"), it will not run on a desktop: Android version 2.1 or later; BlackBerry OS 4.5 - 6.0; iPhone iOS 3.1.3 or later. However, there are several emulators for developers, so I wonder if it is possible to run one of these emulators and then run Google Authenticator inside it. I do realize this is not a best practice, but I'm less worried about my laptop getting stolen and more worried about someone just hacking the account. So my question is this: is it possible to run it on the desktop, even though it is not meant to be / not recommended?
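
    A minimal sketch of one way to generate the same six-digit codes on a desktop, assuming the oath-toolkit package is installed and you have the base32 secret that Google shows during enrolment under "can't scan the barcode?" (the secret below is a placeholder): Google Authenticator implements standard time-based one-time passwords (RFC 6238), so any TOTP tool with the same secret and a correct clock produces identical codes.

        # Debian/Ubuntu: apt-get install oathtool   (Homebrew: brew install oath-toolkit)
        oathtool --totp -b "JBSWY3DPEHPK3PXP"    # -b = the secret is base32; prints the current code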

    Read the article

  • MS Access 2007 end user access

    - by LtDan
    I need some good advice. I have used Access for many years and I use SharePoint, but never the two combined. My newly created Access db needs to be shared with many users across the organization. The back end is SQL, and the old way to distribute the database would be to place the db on a shared drive, set up ODBC connections on each PC to the SQL db, and then the users would open the database and have at it. This has become the OLD way. What is the best (and simplest) way to allow the end users to use a front end for data entry/editing, reporting, etc.? Can I create a link through SharePoint so the users just open it from there? Your advice is greatly appreciated.

    Read the article

  • Suggest server specifications for an ASP.NET and SQL Server 2008 application with about 2000 concurrent users, please.

    - by amkh
    We have a web application project which will be built with ASP.NET 4.0, Entity Framework, and SQL Server 2008 R2. To gauge the load, suppose a normal page of this application runs a query that takes 10 milliseconds to respond on a Core2 Quad @ 2.8GHz processor with 2x2GB of DDR3 RAM (Entity Framework overheads included), and we will have about 2000 concurrent users at peak times. So, what are the recommended specifications (CPU/RAM/RAID/...) for the server that will host this application? Or: how can I calculate that?
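
    A rough back-of-the-envelope sketch of the second question, using only the figures quoted above (the arithmetic is illustrative, not a sizing recommendation): if a typical request costs about 10 ms of database CPU, one core sustains roughly 100 queries per second and the quad-core about 400, so 2000 concurrent users fit as long as each issues a request no more often than about every 5 seconds; real sizing also has to budget ASP.NET CPU time, RAM for the working set, and disk I/O.

        # crude peak-throughput estimate from the measured per-query time
        awk 'BEGIN {
            ms_per_query = 10                 # measured single-query response time
            cores        = 4                  # the Core2 Quad used for the measurement
            users        = 2000               # concurrent users at peak
            qps          = cores * 1000 / ms_per_query
            printf "peak capacity : ~%d queries/sec\n", qps
            printf "per user      : one request every %.1f s at saturation\n", users / qps
        }'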

    Read the article

  • How to take a backup mirror copy of the C: drive?

    - by metal gear solid
    I've installed everything I need on my C: drive: Windows 7, updated drivers, utilities, software, etc. Now I want to take a backup mirror of everything onto a DVD, or I can keep the backup on another USB HDD, so that if I face any Windows or hard-drive failure in the future I can restore everything exactly as it is today. I don't want to reinstall everything again: Windows, drivers, all the utilities and all the needed software. My C: drive's total capacity is 108 GB but the data on it is only 12 GB. What should I do? What is the best solution for me? I need a free solution.
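
    A minimal sketch of one free approach, assuming the machine can be booted from a Linux live CD/USB and the Windows disk appears as /dev/sda with the external USB disk as /dev/sdb1 (device names and mount point are placeholders): an offline, compressed raw image of the whole drive goes to the USB disk, and writing it back later restores Windows, drivers and programs exactly as they were; with only 12 GB in use the compressed image is usually much smaller than the raw 108 GB.

        # boot a live Linux system, attach the USB HDD, then:
        sudo mount /dev/sdb1 /mnt/usb

        # take a compressed image of the whole Windows disk
        sudo dd if=/dev/sda bs=4M conv=sync,noerror | gzip -c > /mnt/usb/win7-c-drive.img.gz

        # restore later by reversing the pipe (careful: this overwrites /dev/sda)
        gunzip -c /mnt/usb/win7-c-drive.img.gz | sudo dd of=/dev/sda bs=4M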

    Read the article

  • ffmpeg conversion with full HTML5 format support

    - by user58542
    I need to know which libraries ffmpeg requires to convert any audio or video format into the formats below, and the best configuration for converting audio and video files for HTML5. For audio files I need to support MP3 and Ogg; for video files I need to support FLV, H.264, Ogg Theora and VP8 (WebM). I'm using Debian and the deb-multimedia repository. I need the list of packages required for these formats (converting any format to these formats), and the configuration, whether installed through the package manager or compiled from the latest ffmpeg repository. Thanks a lot.
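
    A minimal sketch of the conversions themselves, assuming an ffmpeg build with the usual external encoders enabled (libmp3lame, libvorbis, libx264, libtheora and libvpx, which on Debian generally means installing ffmpeg and those encoder libraries from deb-multimedia); the input file names are placeholders, and older builds may need "-strict experimental" for the built-in AAC encoder:

        # HTML5 audio: MP3 and Ogg Vorbis
        ffmpeg -i input.wav -c:a libmp3lame -q:a 2 output.mp3
        ffmpeg -i input.wav -c:a libvorbis  -q:a 5 output.ogg

        # HTML5 video: H.264/MP4, Ogg Theora and VP8/WebM
        ffmpeg -i input.avi -c:v libx264   -crf 23 -c:a aac        output.mp4
        ffmpeg -i input.avi -c:v libtheora -q:v 7  -c:a libvorbis  output.ogv
        ffmpeg -i input.avi -c:v libvpx    -b:v 1M -c:a libvorbis  output.webm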

    Read the article

  • What questions do I need to ask for a database sync?

    - by user65745
    I am currently helping to implement an RFID inventory management system for my company. The software that we are locked into has been at best buggy and unreliable. The software provider is now rolling out a major release. My problem is that the new software release keeps a local database on each machine that then syncs to a master database online. According to the software company we cannot do a scaled rollout because of data corruption issues between the software releases. What questions can I be asking and what sort of testing can I do on my end to make sure this software works? Any suggestions would be very helpful.

    Read the article

  • Make the recycle bin of the SSD on a RAID0 drive?

    - by Rolnik
    I don't know about you folks, but I hate the idea of junk sitting on my tiny 30GB SSD. Any way to designate another drive to be the host of the Recycle Bin for items formerly on the SSD? Basically, I need to know how to make a lower-priority drive receive the recycled materials from the 'main' drive, which happens to be short on space. The best thing I can think of is a batch file that a) syncs 'recycle' to another drive; and b) empties the recycle bin. ... but that's too much work for me.

    Read the article

  • Dependency diagramming / mapping tool [closed]

    - by Lars
    I am looking for a tool that allows me to easily create and maintain dependency maps of our mission-critical servers, apps, processes, etc. It needs to be intuitive and easy to work with, and be able to generate diagrams that clearly show the dependencies graphically. What would be some good tools for this? I have looked at videos for AssetGen Sysmap and BluePrint from Pathwaysystems.com, and they both seem to fit my needs, but there must be more good systems like them that I should look at. I want to make sure I pick the best system for our needs (and limited budget).

    Read the article

  • How to compare files/directories of 2 separate Solaris boxes?

    - by chz
    Hi friends, I have 2 Solaris boxes and I need to check certain directories (on the local filesystem and on mounted NFS) to make sure that they match up on both boxes, and to delete or move any mismatches elsewhere on the local filesystem. I looked into Unix commands like rsync and tree, but it appears that these commands are not available on my Solaris boxes. What is the approach to this problem with the least pain: get rsync or tree and diff their outputs, or use find? I have trouble limiting the find command to certain directories, as there are mounted folders that contain too many XML files that I don't care much about. What's the way to search multiple directory paths with a single find command? Thanks. Sincerely
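
    A minimal sketch with stock tools only, assuming both boxes have cksum and the directories of interest are /app/conf and /app/bin, with the noisy NFS mount living under a directory called nfs-mount (all paths and host names are placeholders): find accepts several starting paths before its predicates and -prune skips the unwanted subtree; each box then writes a sorted list of checksums, and diff of the two lists shows exactly which files differ or exist on only one side.

        # on each box: one find command over several start paths, skipping the NFS mount
        find /app/conf /app/bin -type d -name nfs-mount -prune -o -type f -print \
            | sort | xargs cksum > /tmp/filelist.boxA     # name the output after the box

        # pull the other box's list over and compare
        scp boxB:/tmp/filelist.boxB /tmp/
        diff /tmp/filelist.boxA /tmp/filelist.boxB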

    Read the article

  • Use the "using" statement on objects that implement the IDisposable interface

    - by mbcrump
    From MSDN: "C#, through the .NET Framework common language runtime (CLR), automatically releases the memory used to store objects that are no longer required. The release of memory is non-deterministic; memory is released whenever the CLR decides to perform garbage collection. However, it is usually best to release limited resources such as file handles and network connections as quickly as possible. The using statement allows the programmer to specify when objects that use resources should release them. The object provided to the using statement must implement the IDisposable interface. This interface provides the Dispose method, which should release the object's resources." In my quest to write better, more efficient code I ran across the "using" statement. Microsoft recommends that we specify when to release objects. In other words, if you use the "using" statement, this tells .NET to release the object specified in the using block once it is no longer needed. So, using this block:

        private static string ReadConfig()
        {
            const string path = @"C:\SomeApp.config.xml";

            using (StreamReader reader = File.OpenText(path))
            {
                return reader.ReadToEnd();
            }
        }

    the compiler converts it to:

        private static string ReadConfig1()
        {
            StreamReader sr = new StreamReader(@"C:\SomeApp.config.xml");

            try
            {
                return sr.ReadToEnd();
            }
            finally
            {
                if (sr != null)
                    ((IDisposable)sr).Dispose();
            }
        }

    Read the article

  • AngularJS - shop data disappears after using external payment script

    - by rZaaaa
    I'm building a shopping cart in AngularJS. Up to now everything has gone well, but now I'm at the checkout phase of my project. The problem is that I'm using external payment gateways such as iDEAL. When I check out using, for example, iDEAL, the page redirects to the bank's login page, and all I have is a return URL. When I get to the return URL, all the Angular data is gone... I don't know how to handle this properly. Also, when I check out and then hit BACK from the bank's page, the data is gone as well and I have to do all the steps again, fill the cart, etc. So I guess I have to do something with sessions, but what is the best way to do this with AngularJS? The PHP backend is the Slim framework. In the PHP version of my website I use the session-generated ID for "lost" carts: if a user comes back, the session is the same, so I can retrieve his data (other session variables)...

    Read the article

  • What will ECMAScript 6 bring to the table for us?

    - by user697296
    Our company ported moderate chunks of business logic to JavaScript. We compile the code with a minifier, which further improves performance. Since the language is dynamically typed, it lends itself well to obfuscation, which occurs as a byproduct of minification. We went to great efforts to ensure it positively screams, performance-wise. We can now do what we did before, faster, better, with less code, on more platforms. In summary, we are very satisfied with the current state of the language. I personally love the language, especially for its cross-platform nature. So naturally, I read up a lot about the state of JavaScript compilers, performance and compatibility across as many browsers and platforms as I have time to research. The one theme that has been growing louder and louder these days is the news about ECMAScript 6. So far, what I have been able to gather is that ES6 promises a better development experience: firstly by enabling new ways to do things, secondly by reporting errors early. This sounds great for those who are still waiting for the language to meet their needs before jumping on board. But we have already jumped on board in a big way. Sure, I expect that we will have to do ongoing maintenance and feature revisions on our code through the years, and that we would obviously make use of best practices at the time. But I don't see us refactoring major portions of it to take advantage of language features that are mostly intended to boost developer productivity. I keep wondering, what impact will the language advances ultimately have on our existing, well-written, well-performing code base? Is there something I am missing? Is there something we ought to look out for? Does anyone have tips or guidance on how we should approach the ecmascript.next finalization? Should we care?

    Read the article

  • Turning a running Linux system into a KVM instance on another machine

    - by Charles
    I have two physical machines that I wish to virtualize. I cannot physically plug the hard drives from either machine into the new machine that will act as their VM host, so I think that copying the entire structure of the system over using dd is out of the question. How can I best go about migrating these machines from their hardware to the KVM environment? I've set up empty, unformatted LVM logical volumes to host their filesystems, with the understanding that giving the VMs a real partition to work with achieves higher performance than sticking an image on the filesystem. Would I be better off creating new OS installs and rsyncing the differences over? FWIW, the two machines to be VM'd are running CentOS 5, and the host machine is running Ubuntu Server 10.04 for no particularly important reason. I doubt this matters too much, as it's still going to be KVM and libvirt that matter.
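
    A minimal sketch of both routes, assuming the CentOS machines can reach the KVM host over SSH and the waiting logical volume is /dev/vg0/centos-web1 and at least as large as the source disk (names are placeholders): the first streams the raw disk into the LV over the network, so no drives ever move and the existing install arrives byte-for-byte; the second installs a minimal CentOS 5 guest first and rsyncs the differences across, which avoids carrying over the old partition layout.

        # Route 1: stream the whole source disk into the logical volume over SSH
        # (run on the source machine, ideally booted from a live CD so the disk is quiet)
        dd if=/dev/sda bs=4M | ssh root@kvmhost 'dd of=/dev/vg0/centos-web1 bs=4M'

        # Route 2: after a minimal CentOS 5 install in the guest, sync the real system over
        # (run on the source machine; pseudo-filesystems are excluded)
        rsync -aHv --numeric-ids --exclude='/proc/*' --exclude='/sys/*' --exclude='/dev/*' \
            / root@guest-ip:/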

    Read the article

  • Is it safe/wise to run Drupal alongside bespoke business web apps in production?

    - by Vaze
    I'm interested to know the general community feeling about the safety of running Drupal alongside bespoke, business-critical ASP.NET MVC apps on a production server. Previously my employer's Drupal-based 'visitor website' was hosted as a managed service with a 3rd party, while the LoB sites were hosted in-house. That 3rd party is no longer available, so I'm considering my options: bring Drupal in-house, or find another 3rd party. My concern is that I have little experience with Drupal administration (and no experience securing it) and that the addition of PHP to my IIS server poses a security risk. Is there a best practice that I can follow in this situation?

    Read the article

  • What's the easiest way to upgrade a single laptop hard drive?

    - by rutherford
    My HD is nearing the end of its life, and I wondered if producing an image of the original is all that's required. What's the best software to use for this, and how can I accomplish it? Can I use my other laptop to hold the image file, or do I need to buy an external HD to accomplish the clone? Does having several partitions, both FAT32 and NTFS, with PartitionMagic installed affect things at all? Is it just a matter of making the image, swapping in the new HD and copying the image onto it?

    Read the article

  • What service or software should I use to serve advertising on a site with about 120k monthly page views?

    - by JasonBirch
    I have a site that is generating about 120k monthly page views and is being hosted on a shared FreeBSD server where I have access to PHP and MySQL. I am using some custom PHP server-side scripts that give each of my ad networks (AdSense, Tribal Fusion, etc) an adjustable percentage of impressions in each of the ad positions on my pages. I am looking for a better way of managing and measuring the delivery of these ads, and would also like to be able to take direct placements and provide statistics to the clients. I am looking at options including OpenX self-hosted, OpenX community, and Google DoubleClick for Publishers Small Business (DFP), but am having difficulty determining which one will best meet my needs. They all seem to have pretty steep learning curves compared to my simple scripts. What I have taken away so far as the benefit of self-hosting is that I don't have to pay for the service if I exceed a maximum number of ad impressions, while both OpenX Community and DFP have free impression limits. Of course, if I was doing those kind of numbers I'd need to upgrade my hosting account, but I'm not sure even at that point whether it would be cheaper to serve the ads myself than pay for a premium service. Apart from this, I really need insights into what features differentiate these services, why I might want to choose one over another, and if there are any other competing products or service of the same quality that I should look into. Answers from webmasters who have used both (or all three) services and can talk to usability and ease of ad management would be highly appreciated.

    Read the article
