Search Results

Search found 19586 results on 784 pages for 'machine instruction'.

Page 487 of 784

  • With dnsmasq as the DNS server, 'dig' and 'ping' succeed while 'nslookup' fails

    - by einpoklum
    I installed dnsmasq on a machine of mine (Kubuntu 12.04 LTS), backed only by /etc/hosts (no connection to the Internet until later). Now, if I dig mymachine, I get 192.168.0.1, but if I try to nslookup mymachine, I get: >> connection timed out; no servers could be reached. I also tried nslookup mymachine.mynicedomain.org - that didn't work either. (Edit:) pinging succeeds. This happens both on the server machine itself and on other machines on the network. How can I get the DNS lookups to work? What problem is preventing nslookup from succeeding?
    Additional information:
        In the server's /etc/hosts:
            192.168.0.1 mymachine
        In the server's nsswitch.conf:
            hosts: files mdns4_mininal [NOTFOUND=return] dns mdns4
        (admittedly, this is a bit weird; I also tried hosts: files dns instead, with the same effect)
        In resolv.conf (which is generated by dnsmasq):
            nameserver 127.0.0.1
            search mynicedomain.org
        In the server's /etc/hosts.allow:
            domain: ALL
        In the other machines' /etc/resolv.conf (this is set by the DHCP client):
            nameserver 192.168.0.1
            search mynicedomain.org
        Relevant netstat output on the server:
            Proto Recv-Q Send-Q Local Address     Foreign Address  State
            tcp   0      0      127.0.0.1:53      0.0.0.0:*        LISTEN
            tcp   0      0      192.168.0.1:53    0.0.0.0:*        LISTEN
        Finally, here's the ipconfig output from one of the client machines on the network (running Windows 7):
            Connection-specific DNS Suffix . : mynicedomain.org
            Description . . . . . . . . . . . : Intel(R) 82579LM Gigabit Network Connection
            Physical Address. . . . . . . . . : 12-34-56-78-9A-BC
            DHCP Enabled. . . . . . . . . . . : Yes
            Autoconfiguration Enabled . . . . : Yes
            IPv4 Address. . . . . . . . . . . : 192.168.0.50(Preferred)
            Subnet Mask . . . . . . . . . . . : 255.255.255.0
            Lease Obtained. . . . . . . . . . : Sunday, October 20th 2013 16:20:25
            Lease Expires . . . . . . . . . . : Sunday, October 20th 2013 18:20:24
            Default Gateway . . . . . . . . . : 192.168.0.1
            DHCP Server . . . . . . . . . . . : 192.168.0.1
            DNS Servers . . . . . . . . . . . : 192.168.0.1
            NetBIOS over Tcpip. . . . . . . . : Enabled
    Notes: May be related to this question.
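
    A quick way to narrow this down is to query the dnsmasq instance explicitly with each tool, and over both UDP and TCP; a hedged diagnostic sketch (hostnames and addresses taken from the question above):

        # Ask dnsmasq directly, on the server itself and from a client
        dig @127.0.0.1 mymachine              # on the server
        dig @192.168.0.1 mymachine            # from a client
        nslookup mymachine 127.0.0.1          # force nslookup to the same server
        nslookup mymachine 192.168.0.1

        # The netstat output above only shows TCP listeners; confirm UDP port 53 too
        sudo netstat -lnup | grep :53

        # Force the query over TCP to see whether only UDP lookups are failing
        dig +tcp @192.168.0.1 mymachine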

    Read the article

  • Re-set up kernel modules for VirtualBox and now Ubuntu is in an initramfs state

    - by UbuntuMan
    My 10.04 LTS guest became read-only, so I wanted to reboot this virtual machine. Then I couldn't get it back up and running: VirtualBox threw a Result Code: NS_ERROR_FAILURE (0x80004005) error. I re-ran the kernel module setup:
        sudo /etc/init.d/vboxdrv setup
        * Stopping VirtualBox kernel modules   [ OK ]
        * Uninstalling old VirtualBox DKMS kernel modules   [ OK ]
        * Trying to register the VirtualBox kernel modules using DKMS   [ OK ]
        * Starting VirtualBox kernel modules   [ OK ]
    Now the guest drops to an initramfs prompt and I can't even use sudo in this state. What can I do?
        (initramfs) /dev/sda1
        /bin/sh: /dev/sda1: Permission denied
        (initramfs) /dev/sda2
        /bin/sh: /dev/sda3: Permission denied
        (initramfs) /dev/sda3
        /bin/sh: /dev/sda1: not found
    I have a similar image, so I can check the disk settings if needed. Please help me. Thanks. I have an older version of VirtualBox, 4.0.8 or something like that, but other VMs on the same VirtualBox are working fine.
    UPDATE: I can hold down the SHIFT key and see the GRUB menu. Only one kernel exists: Ubuntu(...) Ubuntu(recovery) master test master test
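
    At the (initramfs) prompt the device names are being typed as commands, which is why the shell answers "Permission denied"; if the root filesystem just needs a check, something along these lines is the usual approach. This is only a sketch - it assumes an ext3/ext4 root on /dev/sda1 and that fsck is available in the initramfs (if it isn't, the same check can be run from the GRUB recovery entry or a live CD):

        (initramfs) fsck -y /dev/sda1      # repair the root filesystem
        (initramfs) fsck -y /dev/sda3      # repeat for other partitions if needed
        (initramfs) exit                   # continue booting (or: reboot)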

    Read the article

  • How is my password sent across when I check Gmail or access a bank site [closed]

    - by learnerforever
    What encryption is used when my password is sent across as I check Gmail or do online banking? RSA? DSA? Public/private-key encryption? In key-based encryption, which entity is assigned a public/private key pair? Does each unique machine with a unique MAC address have a unique public/private key? Does each instance of the browser have a unique key? Does each user have a unique private/public key? How does the session key come into the picture? How do machines receive their keys?
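
    For what it's worth, both Gmail and online banking send the password inside a TLS (HTTPS) session: the server presents a certificate containing its public key (typically RSA), the browser and server negotiate a symmetric session key, and the password travels encrypted under that session key. Keys belong to the server (and optionally to a client certificate), not to a machine's MAC address. A hedged way to look at the pieces for a given site is openssl's built-in client:

        # Show the certificate chain, the server's key and the negotiated cipher
        openssl s_client -connect mail.google.com:443 < /dev/null

        # Dump the certificate details (issuer, validity, public key) in readable form
        openssl s_client -connect mail.google.com:443 < /dev/null 2>/dev/null | openssl x509 -noout -text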

    Read the article

  • IIS7.5 Windows Authentication missing providers menu item (ntlm)

    - by Alex Bilbie
    I'm trying to enable NTLM authentication on a Windows Server 2008 R2 machine with IIS 7.5 for a specific file in my web root. I've been following these instructions: http://docs.moodle.org/en/NTLM_authentication#IIS_Configuration In the IIS Manager I open the Authentication module, disable Anonymous Authentication and enable Windows Authentication. However, according to every post I can find on the matter, a 'Providers' option should then appear, but it doesn't. I've double-checked in Server Manager that the 'Windows Authentication' security feature is enabled for IIS. Any help anyone can offer would be great. Thank you!
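
    If the Providers link simply is not showing in the Actions pane, the same settings can usually be made from the command line with appcmd; a hedged sketch, run from an elevated prompt ("Default Web Site" is an assumption - substitute the real site or path, and note the section is often locked at server level, hence /commit:apphost):

        %windir%\system32\inetsrv\appcmd.exe set config "Default Web Site" ^
            -section:system.webServer/security/authentication/windowsAuthentication ^
            /enabled:true /commit:apphost

        REM Inspect the current provider list (NTLM / Negotiate) for the site
        %windir%\system32\inetsrv\appcmd.exe list config "Default Web Site" ^
            -section:system.webServer/security/authentication/windowsAuthentication

        REM Illustrative: drop Negotiate so only NTLM is offered
        %windir%\system32\inetsrv\appcmd.exe set config "Default Web Site" ^
            -section:system.webServer/security/authentication/windowsAuthentication ^
            /-"providers.[value='Negotiate']" /commit:apphost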

    Read the article

  • Recovering a lost website with no backup?

    - by Jeff Atwood
    Unfortunately, our hosting provider experienced 100% data loss, so I've lost all content for two hosted blog websites: http://blog.stackoverflow.com http://www.codinghorror.com (Yes, yes, I absolutely should have done complete offsite backups. Unfortunately, all my backups were on the server itself. So save the lecture; you're 100% absolutely right, but that doesn't help me at the moment. Let's stay focused on the question here!) I am beginning the slow, painful process of recovering the websites from web crawler caches. There are a few automated tools for recovering a website from internet web spider (Yahoo, Bing, Google, etc.) caches, like Warrick, but I had bad results using it: my IP address was quickly banned from Google for using it; I got lots of 500 and 503 errors and "waiting 5 minutes…" messages; and ultimately I could recover the text content faster by hand. I've had much better luck by using a list of all blog posts, clicking through to the Google cache and saving each individual file as HTML. While there are a lot of blog posts, there aren't that many, and I figure I deserve some self-flagellation for not having a better backup strategy. Anyway, the important thing is that I've had good luck getting the blog post text this way, and I am definitely able to get the text of the web pages out of the Internet caches. Based on what I've done so far, I am confident I can recover all the lost blog post text and comments. However, the images that go with each blog post are proving… more difficult. Any general tips for recovering website pages from Internet caches, and in particular, places to recover archived images from website pages? (And, again, please, no backup lectures. You're totally, completely, utterly right! But being right isn't solving my immediate problem… unless you have a time machine…)
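
    For the images specifically, the Internet Archive's Wayback Machine tends to be a better source than search-engine caches, since it archives page assets as well as HTML. A rough, hedged sketch of checking and mirroring a snapshot (the availability endpoint, URL and flags below are illustrative of the approach, not a guaranteed recipe):

        # Ask the Wayback Machine whether it has a snapshot of a given page
        curl "http://archive.org/wayback/available?url=www.codinghorror.com/blog/"

        # Politely mirror a known snapshot, pausing between requests to avoid being banned
        wget --wait=5 --random-wait --page-requisites --convert-links \
             "http://web.archive.org/web/2010/http://www.codinghorror.com/blog/"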

    Read the article

  • Windows - Delayed Write Failed error on USB hard drive

    - by ndngrd
    I've got a new Verbatim 1.5TB USB hard drive (Samsung HD154UI) and I'm finding myself completely unable to fill it. I'm using Windows XP. Whenever I try to copy a load of files over, it works for some time (it will copy between 20 and 90GB) but eventually stops with an error saying "The specified path is too deep" - the specified path is not too deep; nothing I'm copying is more than two directories deep. A balloon pops up at the bottom saying "Windows - Delayed Write Failed", telling me the data could not be copied. This wouldn't be too bad if I could just restart the transfer, but after this error has happened I can't write anything else to the disk - including if I eject it and then connect it to another machine. It just seems completely locked. The only way I can unlock it is to delete everything that I was copying to it. I've tried various USB cables and copying from different machines, and the same thing keeps happening.
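
    Before writing the drive off as faulty, it may be worth ruling out filesystem damage and XP's write-cache policy; a hedged sketch (the drive letter X: is a placeholder):

        REM Check the volume for filesystem errors, then (much slower) for bad sectors
        chkdsk X: /f
        chkdsk X: /r

        REM In Device Manager > (the disk) > Policies, "Optimize for quick removal"
        REM turns off write caching for the USB disk, which often avoids delayed-write errors.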

    Read the article

  • Internet very slow after upgrading to Ubuntu 9.10

    - by roojoo
    I was running Ubuntu 8.x on my desktop and everything worked fine. I'm using wired internet and it worked perfectly; pages loaded pretty fast. However, when I decided to upgrade to 9.10 the upgrade failed at some point, though I was left with what appeared to be Ubuntu 9.10. Since then the internet has been weird. When I go to a website it takes at least 10 seconds for the page to display, but if I'm on a site and navigate to other pages on the same website they load quickly. This never happened prior to the upgrade. I thought this might be due to the upgrade not installing correctly, so I did a fresh install of Xubuntu 9.10, but the problems are still the same. I'm writing this on a Vista machine over the wireless network and the internet is fine there. Does anyone have any ideas about the issue? Thanks.
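
    The "first page is slow, further pages on the same site are fast" pattern usually points at name resolution rather than bandwidth; a hedged way to test that on the 9.10 box (the IPv6 toggle assumes kernel support and is only a temporary experiment):

        # Time a DNS lookup directly; a slow or timed-out first query suggests a resolver problem
        time dig www.example.com
        cat /etc/resolv.conf

        # Temporarily disable IPv6, which was a common cause of exactly this delay on 9.10
        sudo sysctl -w net.ipv6.conf.all.disable_ipv6=1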

    Read the article

  • SVN checkout to a Samba directory

    - by Jon H
    I'm trying to run svn co on OS X into a directory that lives on an Ubuntu machine and is shared via Samba, but I get the following error (on the OS X side):
        svn: In directory 'site/product/tests'
        svn: Can't open file 'site/product/tests/.svn/tmp/text-base/._base.py.svn-base': No such file or directory
    My smb.conf file includes the following changes:
        unix extensions = no
        browseable = yes
        public = yes
        writable = yes
        delete readonly = yes
        create mask = 0775
        directory mask = 0775
        valid users = %S
        read only = no
    The checkout works fine locally (on the Ubuntu machine). What am I missing?
    More detail: later inspection showed that svn was failing to find the file named with three underscores, then two:
        .___init__.py.svn-base
    whereas listing the directory in OS X showed two underscores, then two:
        __init__.py.svn-base
    and listing the same directory in a successful checkout on Ubuntu shows nothing (because it's a temporary directory?). I've tried the mangled = no setting in the share settings, to no effect.

    Read the article

  • Stopping local drive mappings from transferring to an RDP session

    - by Chad
    We have a SQL server that locally has about 6 physical drives mapped; for example, G: is a mount point to the SAN. If I connect from my local machine and have a personal folder mapped locally as G:\userdata, that mapping transfers to the remote desktop session on the server, overwriting the 'name' (label) of the share. Here is the kicker: the G: on the server still has the right contents, but it shows the wrong label, coming from the share on my PC. Does anyone know how to prevent this from happening? The local resources tick box is unchecked in my Microsoft RDP client.

    Read the article

  • Backup software for Windows Server 2008 R2 Enterprise with 4 virtual machines (Exchange, SQL, AD, SharePoint)

    - by MadBoy
    What are the options for backup software for:
        HOST - Windows Server 2008 R2 Enterprise with Hyper-V
        VIRTUAL - Windows Server 2008 R2 Enterprise with Exchange 2010
        VIRTUAL - Windows Server 2008 R2 Enterprise with SQL Express / SharePoint
        VIRTUAL - Windows Server 2008 R2 Enterprise with Terminal Services (10 users working on it)
        VIRTUAL - Windows Server 2008 R2 Enterprise with AD/DNS
    What I'm looking at is the possibility of having an offsite backup through FTP, and maybe a copy to USB/eSATA/LAN drives so backup data can easily be taken outside the company. What I've been looking at so far:
        - Symantec Backup Exec 2010 System Recovery has offsite backup, but I would need 5 licenses and it doesn't have granular recovery.
        - Symantec Backup Exec 2010 seems OK but a bit expensive.
        - Microsoft DPM 2010 requires full SQL Standard, and for each machine I would need 4 Enterprise licenses. But does it allow offsite backup without the need for an additional license and a server outside the company (for doing a DPM backup of DPM)?
    What other options are there? This is a 10-person company, so cost matters, but so do convenience and security. Offsite backup is a requirement.
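
    As a cost baseline, it may be worth noting what the built-in Windows Server Backup (wbadmin) on the host can already do for crash-consistent, whole-volume backups before pricing DPM or Backup Exec; an illustrative one-liner (the target share and volume letter are assumptions, and offsite FTP plus granular Exchange/SharePoint recovery would still need separate tooling):

        REM Back up the critical volumes plus the volume holding the Hyper-V VHDs to a share
        wbadmin start backup -backupTarget:\\backupserver\hyperv-backups -include:D: -allCritical -vssFull -quiet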

    Read the article

  • remote desktop over wireless

    - by tbischel
    So I'm trying to run remote desktop on my laptop to connect to my home desktop. I have a problem where this works fine if I connect my laptop with an ethernet cable, but fails when I try to use wireless internet access (which works fine for normal internet surfing). I've experienced this problem at home with my wireless router, and at work with the wireless network they have there, so I'm inclined to believe that it's a setting local to my machine rather than the router blocking the requests... but I'm not sure where to look. Any suggestions?
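
    A quick way to separate "wireless problem" from "local machine setting" is to test reachability of the RDP port while on wireless; a hedged sketch (the host name is a placeholder, the firewall command assumes a Vista/7 laptop, and the Telnet client may need to be enabled first):

        REM Is the desktop reachable at all, and is TCP 3389 open from the wireless network?
        ping my-desktop
        telnet my-desktop 3389

        REM Wireless and wired can land in different firewall profiles (Public vs Private)
        netsh advfirewall show allprofiles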

    Read the article

  • Transferring existing files from ext3 to ZFS (on FreeBSD)

    - by peppergrower
    I use an old machine as a file server, for backups, and as a testbed for development. I currently have Debian installed, but I'm very interested in FreeBSD because of ZFS: I really, really like its file integrity features. Before I switch, however, I wanted to ask: what's the best way to migrate my ~400GB of files from the current filesystem (ext3) to ZFS? My number-one requirement is that the migration be absolutely reliable: I don't want to lose any data. (I'll be backing it up before doing this anyway, but still.) My secondary goal is speed: I'd rather not have this take overnight if it doesn't have to. Recommendations? Is FUSE for FreeBSD stable enough to use? What about FreeBSD's native read support for ext3? NFS, maybe? How have you done this?
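
    A common low-risk pattern is to build the pool on FreeBSD, mount the old ext3 disk read-only (FreeBSD's ext2fs driver can read cleanly unmounted ext3 volumes), copy with rsync (from ports), and then verify with a checksum pass; a rough sketch with the pool, device and dataset names as placeholders:

        # On FreeBSD: create the pool and a dataset for the data
        zpool create tank da1
        zfs create tank/files

        # Mount the old ext3 disk read-only and copy, preserving permissions, times and hard links
        mkdir -p /mnt/old
        mount -t ext2fs -o ro /dev/ada1s1 /mnt/old
        rsync -aHv /mnt/old/ /tank/files/

        # Verify: a dry-run checksum pass should report nothing left to transfer
        rsync -aHvn --checksum /mnt/old/ /tank/files/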

    Read the article

  • How do I work around sudo 'segmentation fault' on basic bash commands?

    - by sage
    I am sure the answers are out there, but alas there are too many answers (here and elsewhere) to other questions stopping me from finding them. I just encountered something substantially similar to what is described in the closed SO question, sudo : “segmentation fault” Ubuntu maverick [closed]. My team is using Ubuntu 11.04 on VMware Workstation 8.0.4. We are doing development using C++, Xenomai, Qt, and Qt Creator. When we simulate our application on the virtual machine, we currently need to launch Qt Creator with sudo. My colleague mentioned today that he has been having issues where his workstation locks up and he needs to restart, and that occasionally all sudo bash commands return "segmentation fault". I just ran our application in simulation mode. I was running Qt Creator under sudo and Qt Creator received the abort signal (if I recall). Afterward, every command executed with sudo, from sudo qtcreator to sudo ls, resulted in the message "Segmentation fault". I clicked on the power widget to see if I could log out, but the system shut down straightaway without prompting. My understanding is that we run sudo because of a permissions issue with Xenomai and the VM as currently configured, but my colleague has a workaround for this. I expect that not running Qt Creator under sudo -- something that has always made me nervous -- will help contain this issue, but I find it troubling that this could happen and manifest as it does. Does anyone know what is happening? Any recommendations on how to work around this issue? This is happening often enough that I am trying to lobby for VM changes so we can run the process without sudo.
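
    When commands run under sudo (and only those) start segfaulting inside a VM, the usual suspects are a corrupted sudo or libc binary, filesystem damage on the virtual disk, or bad host RAM; a hedged checklist to run from a root shell (e.g. recovery mode), since sudo itself is unusable at that point:

        # Verify installed files against package checksums (debsums may need installing first)
        debsums -s sudo libc6

        # Reinstall sudo if its binary is damaged
        apt-get install --reinstall sudo

        # Force a filesystem check on next boot, and look for I/O or memory errors in the log
        touch /forcefsck && reboot
        dmesg | tail -n 50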

    Read the article

  • MySQL: "UPDATE command denied to user ''@'localhost'"

    - by Uncle Nerdicus
    For some reason when I installed MySQL on my machine (a Mac running OS X 10.9) the 'root' MySQL account got messed up and I don't have access to it, but I do have access to the standard MySQL account 'sean@localhost', which I use to log into phpMyAdmin. I am trying to reset the 'root' password by starting the mysqld daemon with the command mysqld --skip-grant-tables and then running the following lines in the mysql shell:
        mysql> UPDATE mysql.user SET Password=PASSWORD('MyNewPass')
            -> WHERE User='root';
        mysql> FLUSH PRIVILEGES;
    The problem is that when I try to run that UPDATE statement, the daemon spits back:
        ERROR 1142 (42000): UPDATE command denied to user ''@'localhost' for table 'user'
    as if I hadn't used the -u argument when I started the mysql shell, even though I did. Any help is much appreciated, as I am lost at this point. :/
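
    One thing worth ruling out: if the regular mysqld was still running, the mysql client may have connected to that instance rather than the one started with --skip-grant-tables, which would produce exactly this kind of denial. A hedged sketch of the usual reset sequence for MySQL 5.x on OS X, with the normal server fully stopped first (the mysql.server path varies by install):

        # Stop the regular server, then start one with the grant tables disabled
        sudo /usr/local/mysql/support-files/mysql.server stop
        sudo mysqld_safe --skip-grant-tables &

        # In this mode no -u/-p is needed; every connection has full access
        mysql -e "UPDATE mysql.user SET Password=PASSWORD('MyNewPass') WHERE User='root'; FLUSH PRIVILEGES;"

        # Restart normally and log in with the new password
        sudo /usr/local/mysql/support-files/mysql.server restart
        mysql -u root -p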

    Read the article

  • Distributed transactions and queues, ruby, erlang

    - by chrispanda
    I have a problem that involves several machines, message queues, and transactions. For example, a user clicks on a web page; the click sends a message to another machine which adds a payment to the user's account. There may be many thousands of clicks per second. All aspects of the transaction should be fault tolerant. I've never had to deal with anything like this before, but a bit of reading suggests this is a well known problem. So to my questions. Am I correct in assuming that the secure way of doing this is with a two-phase commit, but that the protocol is blocking and so I won't get the required performance? It appears that DBs like Redis and message queuing systems like Resque, RabbitMQ etc. don't really help me a lot - even if I implement some sort of two-phase commit, the data will be lost if Redis crashes because it is essentially memory-only. All of this has led me to look at Erlang - but before I wade in and start learning a new language, I would really like to understand better whether this is worth the effort. Specifically, am I right in thinking that because of its parallel processing capabilities, Erlang is a better choice for implementing a blocking protocol like two-phase commit, or am I confused?

    Read the article

  • Ghosting context menu clicks in WinXP

    - by Swish
    Let me preface by saying I have a lot of windows open most of the time, although not resource-intensive ones - just browsers, SSH sessions, a music player, an FTP client, Notepad++, IM clients, etc. Anyway, I get a lot of weird visual "ghosting"-type effects. For example, when right-clicking and then selecting an option from a context menu, the selected item will remain in view until I right-click somewhere on the desktop. The same thing happens when selecting items from the File, Edit, etc. menus in various programs. I'm assuming this is just a result of a less than high quality video card (NVIDIA GeForce FX 5200); all the other hardware in the machine is newer and higher quality - that specific video card was added after the fact for multiple monitors. I have looked all over the web for solutions and have increased the number of GDI handles for Windows, reduced the hardware acceleration on the card, etc. Any suggestions other than replacing the card?

    Read the article

  • How to burn a CD-ROM from a Mac for Solaris

    - by cope360
    I would like to burn a CD using a Mac (10.5) which I can then access from a Solaris 10 x86 machine. This partially works:
        1. Insert a blank CD and let the Finder open it so it creates a "Recordable CD" window for it.
        2. Drag the files to be burned into the "Recordable CD" window.
        3. Burn (there are no options except for speed).
    Then to mount it in Solaris:
        mount -F hsfs -o ro /dev/dsk/foo /mnt/bar
    The problem here is that Solaris will see all the filenames as lower case and will only allow each file name to contain one period (these are HSFS limitations). My guess is that the Mac is not burning the disc with the Rock Ridge extensions that allow the full file names to be preserved. Is there some combination of burning tools/options and mount options that will make this work?
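
    The Finder burn appears to produce ISO 9660 without Rock Ridge, which would explain why Solaris falls back to short lowercase names. A hedged alternative is to build the image with mkisofs (from cdrtools, e.g. via MacPorts - it is not part of OS X) and burn the result with hdiutil:

        # -R adds Rock Ridge (full POSIX file names), -J adds Joliet for Windows compatibility
        mkisofs -R -J -o image.iso /path/to/files

        # Burn the finished image from the Mac
        hdiutil burn image.iso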

    Read the article

  • How do I deploy a charm from a local repository?

    - by Matt McClean
    I am trying to follow the charm tutorial from the juju documentation by creating a new charm in a local repository. I started by installing the charms from bzr onto my local Ubuntu 12.04 desktop running in a virtual machine. The new file structure is the following:
        ubuntu@ubuntu-VirtualBox:~$ find charms/precise/drupal/
        charms/precise/drupal/
        charms/precise/drupal/hooks
        charms/precise/drupal/hooks/db-relation-changed
        charms/precise/drupal/hooks/install
        charms/precise/drupal/hooks/start
        charms/precise/drupal/hooks/stop
        charms/precise/drupal/metadata.yml
        charms/precise/drupal/README
    When I install the mysql charm, which was downloaded from the remote charm repository, it works fine. However, when I run the following command to deploy the new charm, it fails with the following error message:
        ubuntu@ubuntu-VirtualBox:~$ juju deploy --repository=charms local:precise/drupal
        2012-05-09 10:01:05,671 INFO Searching for charm local:precise/drupal in local charm repository: /home/ubuntu/charms
        2012-05-09 10:01:05,845 WARNING Charm '.mrconfig' has an error: CharmError() Error processing '/home/ubuntu/charms/precise/.mrconfig': unable to process /home/ubuntu/charms/precise/.mrconfig into a charm
        Charm 'local:precise/drupal' not found in repository /home/ubuntu/charms
        2012-05-09 10:01:06,217 ERROR Charm 'local:precise/drupal' not found in repository /home/ubuntu/charms
    Is there some file missing from the drupal charm directory that juju needs to consider the charm valid? Also, I get the file-processing error for the .mrconfig file when deploying the mysql charm too, so is there something I need to change there as well?
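
    One detail that stands out in the listing above is metadata.yml: juju expects the charm metadata file to be called metadata.yaml, so a directory without it is skipped and only the unrelated .mrconfig warning gets printed. A hedged sketch of the fix (the metadata fields shown are illustrative):

        # juju looks for metadata.yaml (not .yml) inside the charm directory
        mv charms/precise/drupal/metadata.yml charms/precise/drupal/metadata.yaml

        # a minimal metadata.yaml looks roughly like this (values illustrative):
        #   name: drupal
        #   summary: Drupal charm for following the tutorial
        #   description: Minimal local charm used for testing local deployment.

        juju deploy --repository=charms local:precise/drupal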

    Read the article

  • Only two monitor ports are detected on nVidia Quadro NVS 450

    - by Erick Robertson
    I have an nVidia Quadro NVS 450 installed in a Dell OptiPlex 380. Only DisplayPort #1 and DisplayPort #4 are detected by Windows. The machine has a BIOS setting to automatically choose the primary video card, or to disable the primary when a PCI-e card is installed (which it is). Windows cannot see DisplayPort #2 and #3 no matter what I do. I have tried the Windows drivers and the latest nVidia drivers - no dice. I am assured that this video card cannot break in this way. I'm plumb out of ideas. I've tried reseating the video card. I've contacted Dell and they've remoted in and looked around - they threw their arms up after two hours. Any ideas? I'm running Windows 7 64-bit. All four monitors are 1600x1200 Dell monitors.

    Read the article

  • Squid traffic tunneled through VPN

    - by NerdyNick
    What I'm trying to do is have a Squid proxy run on one machine alongside a VPN connection. What I want is for all traffic going through the Squid proxy to use the VPN for its outbound connection, i.e. Desktop -> (Squid Proxy -> VPN). The goal is to allow my desktop selective tunneling through the VPN, so that instant messaging and other things that do not need to run through the VPN can go over my normal connection. Typically I would go through an SSH proxy, but currently I am forced to use the VPN to gain entry into the office, and a Squid proxy seemed like the easiest fit for what I need.
    EDIT: I realize I forgot to actually state the problem I'm running into. I have Squid set up and have verified that it works, but once I connect to the VPN, all requests to Squid are accepted yet Squid is unable to make the outbound request over the VPN, so the client just sits there.
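
    If the VPN client only grabs part of the routing table, one hedged approach is policy routing: mark packets generated by the Squid process (it runs as the proxy user on Ubuntu/Debian) and send anything carrying that mark out through the tunnel. The interface name (tun0), table number and user are assumptions to adapt:

        # Mark outbound packets created by the squid user
        sudo iptables -t mangle -A OUTPUT -m owner --uid-owner proxy -j MARK --set-mark 1

        # Route marked packets via a table whose default route is the VPN tunnel
        sudo ip rule add fwmark 1 table 100
        sudo ip route add default dev tun0 table 100
        sudo ip route flush cache

    Alternatively, if the VPN assigns a local address, Squid's tcp_outgoing_address directive can pin its outbound requests to that address so they get routed over the tunnel.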

    Read the article

  • Unable to access Windows 7 shared folder with Windows 98

    - by PabloG
    I'm unable to access a Windows 7 (Windows 7 Pro 64-bit) shared folder from an old Windows 98 box. I tried:
        Turning on file and printer sharing
        Turning on public folder sharing
        Turning off password protected sharing
        Sharing the folder with read permissions for Everyone
        Lowering the encryption to 40-56 bits
    The shared folder works fine when used from Windows XP, and even from Linux with CIFS / Samba, but when I try to use it from Win98 with:
        NET USE X: \\SERVER\SHARE
    a user / password dialog pops up. I entered the administrator's user / password from my Windows 7 box, but it doesn't work (incorrect password). The same Win98 machine works fine accessing a Windows XP shared folder, so it looks like a Windows 7 networking issue. Any ideas?
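
    Windows 98 can only speak the old LM/NTLMv1 dialects, and Windows 7's default LAN Manager authentication and NTLM session-security policies are typically too strict for it, so the password gets rejected no matter what is typed. A commonly suggested, hedged tweak on the Windows 7 box (a deliberate security trade-off; the same setting is exposed in secpol.msc as "Network security: LAN Manager authentication level"):

        REM Allow LM and NTLM responses in addition to NTLMv2 (scale 0-5; lower = more permissive)
        reg add "HKLM\SYSTEM\CurrentControlSet\Control\Lsa" /v LmCompatibilityLevel /t REG_DWORD /d 1 /f

        REM Then reconnect from the Windows 98 box
        NET USE X: \\SERVER\SHARE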

    Read the article

  • Can't enable wireless lan on Fujitsu Siemens A1665G with Ubuntu 11.10 installed

    - by Theo
    I came across my old notebook yesterday and wanted to make it work again. Under Windows XP the wireless still worked fine. Then I installed a fresh Ubuntu 11.10 32-bit (I replaced Win XP entirely) and sadly I'm not able to get the wireless enabled. lspci lists the following network controller:
        08:0a.0 Network controller: Broadcom Corporation BCM4318 [AirForce One 54g] 802.11g Wireless LAN Controller (rev 02)
    So, following the recommendation from this link, I installed the b43 firmware module. iwconfig prints the following:
        wlan0  IEEE 802.11bg  ESSID:off/any
               Mode:Managed  Access Point: Not-Associated  Tx-Power=off
               Retry long limit:7  RTS thr:off  Fragment thr:off
               Power Management: off
    As you can see, my wireless LAN adapter is not turned on.
        sudo iwconfig wlan0 txpower auto
    doesn't change anything. Then I tried to make it work with rfkill:
        rfkill list
        0: phy0: Wireless LAN
           Soft blocked: no
           Hard blocked: yes
        sudo rfkill unblock all
        rfkill list
        0: phy0: Wireless LAN
           Soft blocked: no
           Hard blocked: yes
    It remains the same. The question is now how I can clear the hard block on the wireless LAN. There is no integrated hardware switch for WLAN; however, there is a button to change the state. I always thought this would be purely software-side, but it seems to make some hardware changes as well... The wireless LED is also not blinking (as it did on Windows XP). I reset the BIOS and searched for settings in there, but it has only a few options, nothing to do with wireless, and nothing worked. Finally I tried to install the Acer hotkeys tool, but I was not able to manage that: I installed the acerhkgui package, but during initialization it was not able to compile Acer hotkeys for my machine - there was a message that asm/linkage.h was not found while compiling. Do you have any ideas what I could do to clear this hard block and make my wireless card work?
    PS: I also tried sudo rm /dev/rfkill and a reboot to re-initialise that stuff... No success :(

    Read the article

  • Visual Studio 2010 on Macbook Air

    - by Kyle B.
    Does anyone here run Visual Studio 2010 (or the VS2012 RC) on a MacBook Air? I have the current model with 4GB RAM, a 13" screen, and a 256GB SSD. Before I go through the effort of configuring this, I'd like to know if anyone in the community has done it, and: Was the performance acceptable? If it is, I plan to get a larger Cinema Display as a second monitor and do all my coding on this machine, ditching my desktop. Did you use Boot Camp, Parallels, or VMware? I feel that Boot Camp would be necessary to make the most of the memory, but I'm not sure that's completely necessary. I'd prefer to use a VM, but wasn't sure whether that was practical, and would value your input before buying a license. Did you also run anything else on the Windows installation, such as SQL Server Express, IIS Express, etc.? Did performance lag after a certain point? Note: I would have asked this on superuser.com, but felt this applied more directly to the programming community.

    Read the article

  • Is there a rule of thumb for RAM upgrades?

    - by Retrosaur
    I'm having a hard time figuring out whether a given laptop's or desktop's RAM can be upgraded, and by how much. Is there a rule of thumb for determining the maximum RAM one could add to a system without looking it up on external websites? A little background: I work in computer sales at a computer electronics store, so it is virtually impossible for me to install any software that would detect a computer's specs, and I get a lot of customers who ask what the usual laptop/desktop RAM upgrade options are. Is there a general rule for how much RAM can be added? Does it make a difference whether it's a 32-bit or 64-bit machine? Does the OS matter?
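
    There is no single rule: the ceiling is whichever is lowest of the chipset/board maximum, the number of slots times the largest module the board accepts, and (for 32-bit Windows) the roughly 4 GB address-space limit, of which only about 3-3.5 GB is usable. On a Windows demo machine the board's own answer can usually be read without installing anything, though firmware sometimes reports it inaccurately:

        REM Maximum supported capacity (reported in KB) and number of memory slots
        wmic memphysical get MaxCapacity,MemoryDevices

        REM What is currently installed, per module
        wmic memorychip get Capacity,DeviceLocator,Speed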

    Read the article

  • How to manage iowait over cifs?

    - by Silvia
    For backup purposes we have a CIFS file server running that contains encrypted containers for backing up the more sensitive data. The container is mounted with cryptsetup and loop as a local filesystem, and rsync is used for the backups. Because the CIFS server is not the fastest machine ever built, running the rsync process results in high iowait on the servers running the backup, which in turn drives Nagios into an email frenzy. The question is: how do I reduce the iowait on the server? Configuring Nagios not to report it seems more like a workaround than a solution. Spreading the backups over different time intervals has already been done, with little effect, and spending money is also not an option because apparently we are talking about a "non-critical system".
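
    Two knobs that usually help without touching Nagios are lowering the I/O priority of the rsync job and capping its transfer rate so the slow CIFS box is never saturated; a hedged example of how the backup invocation might look (paths and the ~5 MB/s cap are placeholders, and the idle I/O class assumes the CFQ scheduler):

        # Run rsync at idle I/O priority, low CPU priority, and limit it to about 5000 KB/s
        ionice -c3 nice -n 19 rsync -a --bwlimit=5000 /data/to/backup/ /mnt/crypto-container/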

    Read the article
