Search Results

Search found 15089 results on 604 pages for 'jeff home'.


  • Prevent Windows 7 from installing a "critical" update that crashes the system

    - by Sylverdrag
    I am running Windows 7 Home Premium. Up until yesterday, everything was going just beautifully, until I received a "critical update". As soon as the critical update was installed, the system became extremely unstable and froze up every few minutes, requiring me to restart the system several times, with the same results. I used System Restore to roll back the update, and everything went back to normal... except that I keep getting a message telling me to restart the computer so that it can install this "critical update". There is no "don't install that crap on my machine" option. Disabling the updates doesn't change anything, turning off the Windows Update service doesn't change anything either, and being on Home Premium, I have no access to Group Policy. Any ideas of what can be done to prevent Windows from installing this update the next time I reboot?

  • Mercurial says hgrc is untrusted in Emacs, but works fine from the command line

    - by Ken
    I've got some Mercurial checkouts in a directory that was mounted by root. Mercurial is usually suspicious of files that aren't mine, but I'm the only user here, so I put: [trusted] users = root groups = root in my ~/.hgrc, and now I can use hg from the command line with no warnings or errors about anything being untrusted. So far, great. But when I try to run, say, vc-annotate in Emacs, I get an Annotate buffer that says: abort: unknown revision 'Not trusting file /home/me/.../working-copy/.hg/hgrc from untrusted user root, group root Not trusting file /home/me/.../working-copy/.hg/hgrc from untrusted user root, group root 7648'! The message area says: Running hg annotate -d -n --follow -r... my-file.c...FAILED (status 255) I don't have anything in my .emacs related to vc or hg. Other commands, like vc-diff, work fine. What am I missing here?
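
    A minimal sketch of the trust setup this question describes, plus a quick way to compare environments (the guess that Emacs invokes hg with a different HOME or environment is an assumption, not a confirmed cause):

        # ~/.hgrc
        [trusted]
        users = root
        groups = root

        # run this from a terminal, then again from M-x shell inside Emacs;
        # if the output differs, the two are not reading the same ~/.hgrc
        hg showconfig trusted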

  • Dial-in VPN Routing issue when on 192.168.x.x network range

    - by Ian
    I'm not an expert on networks, but I have a small office on the 192.168.x.x range which is managed by a Vigor (2800) router. I have enabled the VPN dial-in option on the router so I can get to the server on 192.168.1.100, which works fine from my MacBook when I'm NOT on a local network that is also on the 192.168.x.x range. E.g. it works fine when I tether over my Android smartphone, but when I try to connect while on my home network, it connects, and I can access the router (192.168.1.1) but cannot access 192.168.1.100 - traceroute doesn't hop via 192.168.1.1. I have enabled "send all traffic over VPN connection", but again, no joy... It feels like OS X isn't routing the traffic out to the VPN endpoint because the destination address appears to be on the local subnet. This works fine on a Windows PC on the same home network. Any thoughts on what the issue could be?
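
    One hedged thing to try from the Mac while the VPN is up: add a host route that forces traffic for the office server over the tunnel, since OS X otherwise treats 192.168.1.100 as local. The interface name ppp0 is an assumption; check ifconfig for the one your VPN actually creates.

        # identify the VPN interface (often ppp0 for PPTP/L2TP dial-in)
        ifconfig
        # route just the office server via the tunnel instead of the local LAN
        sudo route -n add -host 192.168.1.100 -interface ppp0
        # undo it afterwards
        sudo route -n delete -host 192.168.1.100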

  • Automatically syncing a USB HD to local HDs (different computers)

    - by luri
    I have a desktop PC at work, and a laptop at home... or elsewhere. What I want to do is use a USB HD to store my documents (about 130 GB, maybe more). That would serve as a backup and also port my files to my laptop. I'd like both computers to automatically sync all files there with local copies, so that I can work at either of them and keep updated copies of everything on both (plus the USB drive, which would allow me to work on other computers too, apart from being another backup). Dropbox isn't a solution for me, due to pricing and its 100 GB limit. The workflow would be as follows, to clear things up: I work on PC1. Changes in files are automatically synced to the USB whenever a file is modified. I go home and boot PC2. I plug in the USB drive and local files are synced (if changed) with the most recent USB copies. While I work at PC2, again, changes in files spread to my USB drive. Whenever I go to PC1 again, I plug in my USB and again everything syncs. So the questions would be: a) Am I crazy? b) Can it be done? c) Will I have any file conflicts (provided I'm the only one who will modify the files)?
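
    A minimal sketch of one way to get this workflow with a two-way synchroniser such as Unison; the mount point /media/usb is an assumption. Changes propagate in both directions, and genuine conflicts are flagged rather than silently overwritten.

        # interactive run: review and accept the proposed changes
        unison ~/Documents /media/usb/Documents -auto

        # unattended run (e.g. triggered when the drive is plugged in)
        unison ~/Documents /media/usb/Documents -batch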

  • Graduating soon with a computer science degree, but have unique circumstances [closed]

    - by Donnie
    I joined the Navy in 1998, and was admitted into Nuclear Power Training. I got my electrician's mate certificate, but was put on medical hold when I was in Nuclear Power Training. I was sent to the Naval Hospital, and received a medical (honorable) discharge in the middle of 2000. I decided to stay at home and raise my son, and my girlfriend worked. A few years ago, I decided that I wanted to work as a programmer, so I went to college and will soon be graduating with a degree in computer science. I hope to finish with a relatively high GPA, 3.8 or 3.9. My question is this: How much, if any, of my Navy experience should I put on my resume? And how do I explain my nine-year gap as a stay-at-home dad? Do I even try to explain it? I know recent college graduates typically have no experience, but obviously I'm not the typical college graduate. Will my long absence from working, or my relatively short time in the Navy, hurt my chances? Should I just put the college on my resume, and hope that HR thinks I'm younger than I am? Obviously, then, my age would show at the interview and there would be questions. Any help is appreciated.

  • GreenGeeks Drupal install: ImageMagick 'path /usr/bin/convert does not exist' error

    - by letapjar
    I just signed up with GreenGeeks. I have a Drupal install (6.19) in my public_html directory. The ImageMagick toolkit can't find the binary - the error I get is "the path /usr/bin/convert does not exist". When I use a terminal and do 'which convert', it shows /usr/bin/convert. Also, I have a second Drupal install in an addon domain - its home directory is above the public_html directory (in a directory called '/home/myusername/addons/seconddomain'). The Drupal install in the addon domain finds the ImageMagick binary just fine. I am at a total loss as to why the original install cannot find the binary. The tech support guys at GreenGeeks have no clue either. Any ideas of things to try?
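
    A few hedged diagnostics; on shared hosting the usual suspects are PHP restrictions (open_basedir, safe_mode) or a PATH/environment that differs between your shell and the web server's PHP, but neither is confirmed here.

        # confirm the binary is really there and world-executable
        which convert
        ls -l /usr/bin/convert

        # see whether PHP itself is restricted (CLI shown; check a phpinfo() page
        # in public_html for the web server's values, which may differ)
        php -i | grep -E 'open_basedir|safe_mode'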

  • Disk full, how to move mysql database files?

    - by kopeklan
    My database files are located in /var/lib/mysql, which is on partition /dev/sda5. This partition is full (refer here for details), so I'm going to move the database files from /var/lib/mysql to /home/lib/mysql. What is the right way to move these database files? I'm going to do these steps:
    1. Stop the HTTP server and PHP
    2. Change datadir=/var/lib/mysql to datadir=/home/lib/mysql in /etc/my.cnf
    3. Move all database files to the new location
    4. Run killall -9 mysql, then /etc/init.d/mysqld start
    5. Start the HTTP server and PHP
    Is this right? Correct me if I'm wrong. Added: currently, MySQL won't stop. Refer here: mysql won't stop, mysqld_safe appeared in top
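
    A hedged sketch of the move itself, using the paths from the question; killing mysqld with -9 risks InnoDB corruption, so getting a clean stop first is worth the effort. Whether AppArmor/SELinux is in play depends on the distro (an assumption here).

        # make sure mysqld is actually down before touching the files
        /etc/init.d/mysqld stop

        # copy with ownership and permissions intact, keep the old copy until verified
        rsync -av /var/lib/mysql/ /home/lib/mysql/
        chown -R mysql:mysql /home/lib/mysql
        mv /var/lib/mysql /var/lib/mysql.old

        # update datadir (and socket, if it pointed under the old path) in /etc/my.cnf,
        # then start and test before removing /var/lib/mysql.old
        /etc/init.d/mysqld start

        # if AppArmor or SELinux is enabled, the new datadir may need to be
        # whitelisted/relabelled before mysqld can read it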

  • Do you tend to write your own name or your company name in your code?

    - by Connell Watkins
    I've been working on various projects at home and at work, and over the years I've developed two main APIs that I use in almost all AJAX-based websites. I've compiled both of these into DLLs and called the namespaces Connell.Database and Connell.Json. My boss recently saw these namespaces in the software documentation for a project for the company and said I shouldn't be using my own name in the code. (But it's my code!) One thing to bear in mind is that we're not a software company. We're an IT support company, and I'm the only full-time software developer here, so there aren't really any procedures on how we should write software in the company. Another thing to bear in mind is that I do intend on one day releasing these DLLs as open-source projects. How do other developers group their namespaces within their company? Does anyone use the same class libraries in personal and in work projects? Also, does this work the other way round? If I write a class library entirely at work, who owns that code? If I've seen the library through from start to finish, designed it and programmed it, can I use that for another project at home? Thanks. Update: I've spoken to my boss about this issue and he agrees that they're my objects and he's fine for me to open-source them. Before this conversation I started changing the objects anyway, which was actually quite productive and the code now suits this specific project more so than it did previously. But thank you to everyone involved for a very interesting debate. I hope all this text isn't wasted and someone learns from it. I certainly did. Cheers.

  • Accessing live USB files from a new HD Ubuntu install

    - by Robin Bailey
    After my live USB (Ubuntu 12.04 LTS) refused to boot, I proceeded to install the same Ubuntu version on the laptop hard drive (a dual boot next to Windows XP). This all went well without a hitch. Previous to this, I spent several weeks enjoying and exploring Ubuntu from the USB pendrive. During this time I changed lots of settings and customized Firefox and more. Now, I'd like to import the home folder from the USB drive into the new install's home folder on the hard disk, which to my knowledge is the folder that holds all those special settings. Unfortunately, being familiar only with Windows file systems, I find the view of the USB file system from the new HDD install totally perplexing. I can't find anything that looks anywhere close to the original file system. What's more, I can't find any of the files I had created and stored there, like the LibreOffice Calc file that has all my passwords (this one is really discouraging), which was stored on the Ubuntu desktop. Help me find this file alone and I'll bow down with full apologies to any and all computer gods. Being able to import all those customized settings into the new install would be a major bonus also, but hey, I'm not greedy. I'll take the passwords file and be happy! And humble! I would be very grateful for some clear, understandable help on this. Thanks
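
    A hedged pointer: a persistent 12.04 live USB normally stores its changes in a casper-rw file (or partition) on the stick, not in a Windows-style folder tree, so the old home folder only exists inside that container. The mount point /media/LIVE_USB and the live user name "ubuntu" are assumptions; if the stick was never set up with persistence, the changes lived only in RAM and are not recoverable.

        # look for the persistence container on the stick
        ls /media/LIVE_USB

        # casper-rw is an ext-formatted image; loop-mount it read-only and browse
        sudo mkdir -p /mnt/casper
        sudo mount -o loop,ro /media/LIVE_USB/casper-rw /mnt/casper
        ls /mnt/casper/home/ubuntu/Desktop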

  • Is there a way to get att.net email to stay connected?

    - by Clay Shannon
    My att.net account at home (wireless connection) has been bad for the last several days: I have to hit F5 quite a few times to "unfreeze" it (I can read an email or two, then it freezes, etc.). At work (company LAN) it's even worse: I can connect to the site and see that I have email, but can't open any of the emails - and the screen constantly refreshes (every couple of seconds) with a "Connecting..." message. It apparently connects and disconnects over and over again, but never stays connected long enough to actually access the email. Is there a way either to fix this OR forward my att.net (from home) to my work email address (accessible via MS Outlook)? Or set it up from work using Outlook to pull in my att.net email? I have Outlook 2003 at work.

  • OpenWorld 2011 Video Index

    - by Chris Kawalek
    We did quite a few virtualization videos this year at Oracle OpenWorld 2011. You can find all these and more on our YouTube channel.
    Virtualization Wrapup: Adam Hawley discusses the Oracle virtualization presence at Oracle OpenWorld 2011. http://www.youtube.com/oraclevirtualization#p/f/2/53_SQYljqN4
    Oracle Applications on iPad: Brad Lackey shows how you can access Oracle Applications on iPad. http://www.youtube.com/oraclevirtualization#p/f/9/3Ug5km3uxEQ
    Thinkquest.org and Oracle VM: Dan Herrup describes how Thinkquest.org is using Oracle VM to help kids learn how to solve real-world problems with computer technology. http://www.youtube.com/oraclevirtualization#p/f/6/Bw-km5kqzEo
    Avaya and Oracle Virtualization: See Oracle desktop virtualization in action at Avaya's booth. http://www.youtube.com/oraclevirtualization#p/f/4/xIHRIijEPkM
    Eco-Features of Sun Ray Clients: Michael Dann shows off the Sun Ray 3 Plus and talks about the eco benefits of Oracle's extremely low power consumption client device for desktop virtualization. http://www.youtube.com/oraclevirtualization#p/f/3/ulArHGe1OmM
    Application and Desktop Access with Oracle Secure Global Desktop: Watch Jeff Harvey do a quick demo of Oracle Secure Global Desktop accessing Oracle Applications. http://www.youtube.com/oraclevirtualization#p/f/5/g_ikA7dwh0g
    Oracle VM VirtualBox for VDI: Andy Hall describes how enterprises leverage Oracle VM VirtualBox as part of their VDI deployments. http://www.youtube.com/oraclevirtualization#p/f/8/WmkeYlzgnZ8
    TechCast Live: The Coolest Virtualization Products: Interview with Andy Hall about the desktop virtualization portfolio. http://www.youtube.com/oraclevirtualization#p/f/7/VMkrAhZ83AA

  • Putting servers inside a refrigerator? [closed]

    - by Muhammad Jamal Shaikh
    It could be a silly question, but I decided to go for it. I shall be buying 3 servers in the next few weeks to set up a small web farm at my home. I am told by different people who work in server rooms that I should keep my servers in an air-conditioned room, which is really expensive, because the temperature here in South Asia is between 10 and 50 degrees C. Here comes the funny part: I have an extra fridge in my home, so why shouldn't I put the servers inside that fridge? Benefits: I don't have to buy the air conditioner. I don't have to buy the rack mount for the servers. The electricity consumed by the fridge is much, much less than an AC. Give me your suggestions!

  • Resolving host names to their domain name in an internal BIND domain

    - by Adam Plumb
    I'm setting up a domain on my home network for learning purposes, using BIND on CentOS to act as the name server. I've got the name server up and running as type master for my internal domain (plumbnicoll.family), and can do forward and reverse lookups from other computers in my LAN. For example, host office2.plumbnicoll.family correctly returns office2.plumbnicoll.family has address 192.168.1.3. What I'd like is to be able to resolve just office2 to its address, without needing to put .plumbnicoll.family at the end. Is this possible, or even desirable to do? I'm running a mixed environment at home with both Linux and Windows computers.
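
    This is usually handled on the clients rather than in BIND: give them a DNS search suffix so bare names get .plumbnicoll.family appended automatically. A minimal sketch; the name-server address 192.168.1.2 is an assumption, and Windows clients get the same effect from the connection's "DNS suffix" setting or from DHCP.

        # Linux clients: /etc/resolv.conf (or whatever generates it)
        #   nameserver 192.168.1.2
        #   search plumbnicoll.family

        # or push it to every client from an ISC DHCP server:
        #   option domain-name "plumbnicoll.family";
        #   option domain-name-servers 192.168.1.2;

        # test: a bare host name should now resolve
        host office2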

  • cURL looking for CA in the wrong place

    - by andrewtweber
    On Redhat Linux, in a PHP script I am setting cURL options as such: curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, True); curl_setopt($ch, CURLOPT_CAINFO, '/home/andrew/share/cacert.pem'); Yet I am getting this exception when trying to send data (curl error: 77) error setting certificate verify locations: CAfile: /etc/pki/tls/certs/ca-bundle.crt CApath: none Why is it looking for the CAfile in /etc/pki/tls/certs/ca-bundle.crt? I don't know where this folder is coming from as I don't set it anywhere. Shouldn't it be looking in the place I specified, /home/andrew/share/cacert.pem? I don't have write permission /etc/ so simply copying the file there is not an option. Am I missing some other curl option that I should be using? (This is on shared hosting - is it possible that it's disallowing me from setting a different path for the CAfile?)
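
    A couple of hedged checks: curl error 77 means the CA file could not be read at all, so file permissions (the web server's user versus your own) and the host's open_basedir are worth ruling out, and it is also worth confirming that each curl_setopt() call actually returns true. The example URL below is only a placeholder.

        # is the bundle readable by the user the web server runs as?
        ls -l /home/andrew/share/cacert.pem

        # does command-line curl accept the same bundle? (rules out a corrupt file)
        curl -v --cacert /home/andrew/share/cacert.pem https://www.example.com/ -o /dev/null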

  • Why can't I run any Android NDK commands?

    - by TheBuzzSaw
    I had been running Mint 12 before, and everything was working there. I switched to Ubuntu 12.04, and now I am very frustrated. When I run ndk-build, I get /home/buzz/ndk/prebuilt/linux-x86/bin/make: not found So, I changed to that folder directly. When I type in ./make, I get bash: ./make: No such file or directory Typing ls clearly shows the file where I am! I did some hacking around (pointing to external tools) to get past each error (just to experiment), and I ran into this! /home/buzz/ndk/toolchains/arm-linux-androideabi-4.6/prebuilt/linux-x86/bin/arm-linux-androideabi-gcc: Command not found Why? Why are all these files unable to be found? As I said above, this was all working just fine in another distro. What changed? What's extra frustrating is that if I push TAB to auto-complete, it works. So, the file is clearly there (and clearly marked with execution permissions). So, why can't it be found?
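
    A hedged guess at the cause: "not found" for a binary that ls can clearly see is the classic symptom of a 32-bit executable on a 64-bit Ubuntu install that lacks the 32-bit loader and libraries; the shell is really reporting the missing 32-bit ld-linux interpreter, not the file you typed. The package name below is the 12.04-era one.

        # "ELF 32-bit" here plus "x86_64" from uname points at missing 32-bit support
        file /home/buzz/ndk/prebuilt/linux-x86/bin/make
        uname -m

        # Ubuntu 12.04: install the 32-bit compatibility libraries
        sudo apt-get install ia32-libs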

  • Network with bridge and port forwarding?

    - by rafek
    Hi! Below is my current (and planned) home network configuration. I would like to connect my non-wifi-capable desktop to my home network. The question is: HOW? What device do I need? The primary requirement is that I need to be able to forward ports to my desktop. How would I achieve this? Is there something like "double port forwarding"? Could anyone please explain this configuration to me? Thank you in advance!

  • What technical details should a programmer of a web application consider before making the site public?

    - by Joel Coehoorn
    What things should a programmer implementing the technical details of a web application consider before making the site public? If Jeff Atwood can forget about HttpOnly cookies, sitemaps, and cross-site request forgeries all in the same site, what important thing could I be forgetting as well? I'm thinking about this from a web developer's perspective, such that someone else is creating the actual design and content for the site. So while usability and content may be more important than the platform, you the programmer have little say in that. What you do need to worry about is that your implementation of the platform is stable, performs well, is secure, and meets any other business goals (like not costing too much, not taking too long to build, and ranking as well with Google as the content supports). Think of this from the perspective of a developer who's done some work for intranet-type applications in a fairly trusted environment, and is about to have his first shot at putting out a potentially popular site for the entire big bad world wide web. Also, I'm looking for something more specific than just a vague "web standards" response. I mean, HTML, JavaScript, and CSS over HTTP are pretty much a given, especially when I've already specified that you're a professional web developer. So going beyond that: which standards? In what circumstances, and why? Provide a link to the standard's specification.

  • Ubuntu Boot Slow - Late Drive Access

    - by Mo2
    So I just installed Ubuntu on a brand new Samsung 840 Pro SSD and have noticed that it is slower than expected. The LED access light doesn't start blinking until after 20 seconds or so have passed, after which the boot up seems to actually start. Does anyone have any ideas as to why this is happening? I have Windows 7 installed on a separate SSD and use Windows Loader as the main loader. Windows Loader points to GRUB2, from which I then start Ubuntu up. The 20-second count that I mentioned earlier starts AFTER I select Ubuntu 14.04.1 from the GRUB2 menu. Any help is appreciated. EDIT: I forgot to mention that I have only one partition for / and /home. I intend on using a shared NTFS drive for my documents and other personal files, so I had no need for a separate /home. I did not make a /swap since I have 12 GB of RAM. I also did not make a /boot since I was told that it's not really necessary in my case.
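
    A hedged way to see where the 20 seconds go: boot-time log timestamps usually show the gap, and one common (but unconfirmed here) culprit is the initramfs waiting on a swap/resume UUID that no longer exists after repartitioning.

        # timestamps in the kernel log show exactly where the pause happens
        dmesg | less

        # check that every UUID referenced at boot still exists
        sudo blkid
        cat /etc/fstab
        cat /etc/initramfs-tools/conf.d/resume   # stale RESUME= entries cause waits

        # if you change the resume file or fstab, rebuild the initramfs
        sudo update-initramfs -u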

  • glusterfs to replicate files to other servers

    - by sbrattla
    I've got multiple servers which all need to have the same content in /home. In other words, if the file /home/user1/test.txt is updated on server A, this needs to be replicated to all other servers in the cluster. Is it possible to use GlusterFS for this purpose? That is, let each server have a full copy of all data locally - which that server will be working on - and solely use GlusterFS to take care of replicating this data to the other servers? I'm not interested in combined storage, but rather in having all data on all machines and only using GlusterFS to replicate it to the other machines.
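
    GlusterFS can do this with a plain replicated volume, with the caveat that reads and writes should go through the Gluster mount rather than directly to the local brick, or the copies will drift; mounting the volume on /home while the bricks live elsewhere gives the "full local copy on every server" effect. A minimal two-node sketch; host names and brick paths are assumptions.

        # on server-a: join the peer and create a 2-way replicated volume
        gluster peer probe server-b
        gluster volume create homes replica 2 server-a:/data/brick1 server-b:/data/brick1
        gluster volume start homes

        # on every server: mount the volume where the data should appear
        mount -t glusterfs localhost:/homes /home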

  • Are VLANs necessary for my environment?

    - by kleefaj
    Greetings. I'm the new network manager for a school. I've inherited an environment made up of several Windows servers, about 100 Windows clients, ten printers, one Cisco router, six Cisco switches, and one HP switch. Also, we're using VoIP. There are four floors in our building. The hosts on each floor are assigned to a separate VLAN. An office on the first floor has its own VLAN. All the switches are on their own VLAN. The IP phones are on their own VLAN. And the servers are on their own VLAN. For the number of hosts on the network, are all these VLANs really buying me anything? I'm new to the VLAN concept but it seems overly complicated for this environment. Or it's genius and I just don't get it. Any thoughts? Thanks, Jeff

  • Best practice, or generally the best way, to set up a web-hosting server: permissions, etc.

    - by Jagot
    Hi, I'm about to set up a server upon which a friend and I will be hosting web sites, and I'll be using Debian. I've set up a LAMP solution many times just for local testing purposes, but never for actual production use. I was wondering what the best practices are in terms of setting the server up, specifically in reference to accessing the web root directory. A couple of the options I have seen: Set up a single user account on the server for us both to use and use a virtual host to point somewhere in the home directory, e.g. /home/webdev/www. Set up a user account for each of us, and grant permissions in some way to /var/www (What would be the best way? Set up a new group?) I want to get this right when I first set this up, as there won't be any going back for a while once our first site is up and running. Appreciate any guidance in advance.
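
    A minimal sketch of the second option: a shared group plus the setgid bit on /var/www so new files inherit the group. The group name and user names below are assumptions.

        # create a shared group and add both users to it
        sudo groupadd webdev
        sudo usermod -aG webdev alice
        sudo usermod -aG webdev bob

        # hand the tree to that group and make new files inherit it
        sudo chown -R root:webdev /var/www
        sudo chmod -R 2775 /var/www

        # optionally set "umask 002" in each user's shell profile so new files
        # are group-writable by default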

  • Offsite AND incremental backup

    - by Pyrolistical
    I already do backups from my main computer to my server computer using SyncToy. But now I also want to do off-site backup. My idea so far:
    - have the source hard drive (we'll call it S) at home
    - have a backup hard drive at work called B
    - have a transport hard drive called T
    - connect T at work and record an index of the files on B
    - take T home, check the index against S, note new/changed/deleted files, and copy the changed files to T
    - take T to work and update B
    - repeat
    It's basically a sneakernet, using all of its advantages: high bandwidth, low latency. Is there some software to do this, or do I have to write it myself?
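
    rsync's batch mode was designed for exactly this sneakernet pattern; the sketch below assumes you keep, at home, a local mirror of what B currently contains (which doubles as yet another backup), and the paths are placeholders.

        # at home: bring the local mirror of B up to date and record the changes onto T
        rsync -a --write-batch=/media/T/changes /data/S/ /data/B-mirror/

        # at work: replay exactly those recorded changes onto B
        rsync -a --read-batch=/media/T/changes /data/B/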

  • Is there a way to only require a password on waking from sleep and not on the screensaver (in Snow Leopard)?

    - by Vitaly Kushner
    I hate it when it asks me for a password when I'm at home and just away from the computer for a while. I do like having a screensaver, though. But for some reason I see that the password setting for the screensaver is merged with the password setting for waking from sleep. And a wake password is an essential security feature for me. Essentially, when I'm not in a secure environment I close the lid when going away from the laptop even for a minute, but at home I want it to stay open. Is there a way to have it ask for a password only after sleep and not after the screensaver?

  • How can I stop a process from moving to the background?

    - by Alex
    I have a machine running Ubuntu server version 12.04.3 LTS. On it, I'm attempting to run a node.js server that needs to stay up and running at all times. I'm running into an issue, however, where periodically I see this happen: [1]+ Stopped sudo node server.js When this happens, I have to manually bring it back with fg, which works fine, at least until it stops again. As far as I can tell, it isn't functioning properly while stopped, since I get no log files in those windows of time. So my question is this: Is there a way to prevent it from being stopped like that? I'm running it in a tmux window, if that changes anything. Also, to address the question before it gets asked: I'm running it as sudo due to some ecryptfs issues I've been having. I was originally running it in my home directory, but when it was left alive for too long things would get out of sync and the file writes it has to do would just stop working. To mitigate that, I moved it out of my home directory, but its new location requires me to use sudo permissions for everything to work correctly. Hopefully that isn't related to the whole background task thing. (sudo and tmux tags included in case one or both turn out to actually be relevant to the solution.)
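
    A hedged note on the symptom: a job is put into "Stopped" like that when it receives SIGTSTP/SIGTTIN/SIGTTOU, most often because a background process tried to read from the terminal; with sudo involved, a tty read or password prompt is a plausible trigger (an assumption, not confirmed). Detaching the process from the terminal avoids the stop; the log path is a placeholder.

        # keep stdin away from the tty and send output to a log file
        # (assumes sudo will not need to prompt for a password here)
        sudo nohup node server.js < /dev/null >> /var/log/node-server.log 2>&1 &

        # or give it its own detached tmux session and never background it at all
        tmux new-session -d -s nodesrv 'sudo node server.js >> /var/log/node-server.log 2>&1'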

  • How to make a server check its own availability on the web?

    - by Javawag
    Hi all, Just a quick question – my server is running at my house, serving www pages at www.javawag.com. The problem is that my home internet connection keeps dropping randomly - for about 10 minutes at a time. This is only an intermittent problem and will go away soon, I hope. However, my server doesn't recover properly - when the connection comes back, I can still access it at 192.168.0.8 (locally) without any issue, but at www.javawag.com there's no reply! (Just an aside - my home internet connection has a dynamic IP; the domain www.javawag.com points to javawag.dyndns.org, which in turn points to my IP, updated every minute by ddclient on the server.) Is there some way for the server to check if it's accessible from the outside world periodically, and if not, restart Apache or reboot? Oh, and if I reboot, the problem fixes itself also! Javawag
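
    A minimal sketch of a self-check: a cron job on the server probes the public name and bounces Apache when the probe fails. The script path, interval, and the Debian-style apache2 init script are assumptions, and if the outage is really the ISP line being down, restarting Apache may not be what fixes it, but it covers the "never recovers after the line comes back" case.

        #!/bin/sh
        # /usr/local/bin/check-web.sh -- add to root's crontab, e.g.:
        #   */5 * * * * /usr/local/bin/check-web.sh
        if ! curl -sf --max-time 30 http://www.javawag.com/ > /dev/null; then
            logger "www.javawag.com unreachable from the server itself, restarting Apache"
            /etc/init.d/apache2 restart
        fi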
