Search Results

Search found 7239 results on 290 pages for 'john michael zorko'.

Page 85/290

  • How does IPv6 subnetting work and how does it differ from IPv4 subnetting?

    - by Michael Hampton
    This is a Canonical Question about IPv6 Subnetting. Related: How does IPv4 Subnetting Work? I know a lot about IPv4 subnetting, and as I prepare to deploy (or work on) an IPv6 network I need to know how much of this knowledge is transferable and what I still need to learn. IPv6 seems at first glance to be much more complex than IPv4. So I would like to know:
    - IPv6 is 128 bits, so why is /64 the smallest recommended subnet for hosts?
    - Related to this: why is it recommended to use /127 for point-to-point links between routers, and why was it recommended against in the past? Should I change existing router links to use /127?
    - Why would virtual machines be provisioned with subnets smaller than /64? Are there other situations in which I would use a subnet smaller than /64?
    - Can I map directly from IPv4 subnets to IPv6 subnets?
    - My interfaces have several IPv6 addresses. Must the subnet be the same for all of them?
    - Why do I sometimes see a % rather than a / in an IPv6 address, and what does it mean?
    - Am I wasting too many subnets? Aren't we just going to run out again?
    - In what other major ways is IPv6 subnetting different from IPv4 subnetting?

    Read the article

  • Directly reading a LTO tape drive

    - by John
    On our server (M$ 2003), is it possible to directly read our LTO-4 tape drive and copy the entire ntbackup-created .bkf file on it to an external hard disk? (Is the backup even stored on the tape as a .bkf file? I'm going off how things worked when we only used external USB hard drives.)

    Read the article

  • Proper use of disk to disk to tape backup using de-duplication and LTO5

    - by Michael
    I currently have ~12TB of data for a full disk-to-tape (LTO3) backup. Needless to say, it now requires over 16 tapes, so I'm looking at other solutions. Here is what I've come up with; I'd like to hear the community's thoughts.
    - Server for disk-to-disk: BackupExec 2010 using de-duplication technology
    - 20+ TB worth of SATA drives
    - LTO5 robotic library connected via SAS
    - 1 Gbps NIC connected to the network
    What I envision is doing a full backup of my entire network, which will initially take a long time over the 1 Gbps NIC, but once the de-duplication kicks in, backups should be quick. I will then use the LTO5 to make disk-to-tape backups and archive those accordingly. What does everyone think? Any faster way of doing the initial full backup over the 1 Gbps NIC? What will be my pain points? Is there a better way of doing what I'm trying to achieve?
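    For scale on that initial pass: a 1 Gbps link tops out around 110-120 MB/s in theory, and more like 80-100 MB/s sustained in practice, so a 12 TB full backup works out to roughly 12,000,000 MB / 90 MB/s ≈ 133,000 s, or about 37 hours of wire time before de-duplication starts paying off.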

    Read the article

  • How to change Windows admin password from guest user

    - by John Smiith
    How can I gain access to the admin account of Windows? I activated the guest user and I want to change the admin password from the command line. When I type: net user administrator password, the response is: System error 5 has occurred. Access is denied. I am using WinXP Pro SP2, and I am running this command from cmd.exe as the guest user. I actually want to change my admin password from the guest user.

    Read the article

  • How to deduplicate 40TB of data?

    - by Michael Stauffer
    I've inherited a research cluster with ~40TB of data across three filesystems. The data stretches back almost 15 years, and there are most likely a good number of duplicates, as researchers copy each other's data for different reasons and then just hang on to the copies. I know about de-duping tools like fdupes and rmlint. I'm trying to find one that will work on such a large dataset. I don't care if it takes weeks (or maybe even months) to crawl all the data; I'll probably throttle it anyway to go easy on the filesystems. But I need to find a tool that's either somehow super efficient with RAM, or can store all the intermediary data it needs in files rather than RAM. I'm assuming that my RAM (64GB) will be exhausted if I crawl through all this data as one set. I'm experimenting with fdupes now on a 900GB tree. It's 25% of the way through and RAM usage has been slowly creeping up the whole time; now it's at 700MB. Or, is there a way to direct a process to use disk-mapped RAM so there's much more available and it doesn't use system RAM? I'm running CentOS 6.
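    One low-memory approach worth sketching (assuming GNU coreutils; the paths are hypothetical): let sort(1) do the heavy lifting, since it spills to temporary files on disk instead of holding everything in RAM.

        # hash every file, then group identical checksums; sort -T keeps temp data on a scratch disk
        find /data -type f -exec md5sum {} + > /scratch/hashes.txt
        sort -T /scratch /scratch/hashes.txt | uniq -w32 --all-repeated=separate > /scratch/dupes.txt

    Hashing 40 TB will still take a long time, but all the intermediate state lives in files, not memory. A common refinement is to group files by size first and only hash sizes that occur more than once.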

    Read the article

  • Samba and Windows 7

    - by John Gaughan
    I built a new computer with the intention of it being primarily a home file server. Here is my setup:
    - one desktop with Windows 7 64 HP
    - one laptop with Windows 7 64 HP
    - one desktop with Kubuntu 11.10 (server)
    The two desktops use static IPs, and I have hostnames mapped in the HOSTS files on all three systems. I have the same username/password combo on all three systems. I have been trying for a while now to set up Samba so the Windows 7 systems can see and use it. Even if I can get the server to show up, Windows is unable to log in. One of the first things I did was to enable LMv2 authentication, which this version of Samba (3.5.11) supports. The workgroup is set correctly. I can normally see the server, but cannot authenticate. Windows HomeGroup is turned off. Pinging between machines works fine, and the two Windows 7 systems work together flawlessly. What I am trying to do is set up Samba for peer-to-peer networking using NTLM security and user-mode authentication. According to the documentation this is possible, but there are no examples that I could find. In all the googling I have done, I see a lot of people asking how to set this up, but it either works for someone else and not for me (no idea what I'm missing), or it doesn't work at all. Has anyone gotten this to work? Is there a place I could download an smb.conf that is set up to work in this environment?
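    A minimal smb.conf along these lines is a reasonable starting point; this is a sketch, assuming the share lives at /srv/share and a user named johng (adjust both to match):

        [global]
            workgroup = WORKGROUP
            # user-mode (peer-to-peer) authentication
            security = user
            # allow NTLM/LMv2-era clients
            ntlm auth = yes
            map to guest = Bad User

        [share]
            path = /srv/share
            read only = no
            valid users = johng

    One frequent gotcha with exactly these symptoms: Samba authenticates against its own password database, not /etc/shadow, so an identical Unix password alone is not enough. Running smbpasswd -a johng and restarting smbd is worth trying before anything more exotic.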

    Read the article

  • Lotus Notes wants to open some emails with web browser, rather than within Notes itself

    - by John K.
    I have a problem with Lotus Notes 8.5.1 wanting to open certain emails in my web browser, rather than within Notes itself. The emails that do this seem to have some HTML in the note (not just a link, but the body of the note itself). This is a recent upgrade from 8.0.2, which had a similar problem, but it showed all of the HTML formatting characters in the notes, rendering them almost unreadable. I was hoping a fresh install of Notes would fix it, but it did not. I am using the same version of Notes and Win XP at work, where I do not have the problem. I am guessing that it is a setting on my home computer and not Notes, but I have no idea what it would be.

    Read the article

  • Copy selected row to another worksheet

    - by ???? ???????
    There are, for example, 10 rows in one worksheet. When the user clicks on one row, it should be presented on another worksheet. Is this possible? Any help doing it? EDIT: To clarify: one sheet presents, for example, student exam marks from the first year:
        John   10  8  10  7
        Nick    8  9   8  9
        Maria   7  8   8  7
    On the second sheet there is student information from the second year:
        John    9  9  10  8
        Nick    8  8   9  7
        Maria   7  6   8  8
    I want to give some kind of final certificate for each student, so summary information should be presented on a third sheet. It doesn't need to be on click; there could be a drop-down list on the third sheet.
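    Since the summary doesn't need to react to clicks, plain lookup formulas are enough. A sketch, assuming names sit in column A of every sheet, the four marks in columns B-E, and the default sheet names (all of which are assumptions):

        =VLOOKUP($A2, Sheet1!$A:$E, 2, FALSE)
        =VLOOKUP($A2, Sheet2!$A:$E, 2, FALSE)

    With a student name in A2 of the third sheet, the first formula pulls that student's first mark from year one and the second pulls it from year two; copying across with column indexes 2 through 5 retrieves all four marks, and a Data Validation list supplies the drop-down of names.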

    Read the article

  • How can I get HAProxy backends to include a path

    - by Michael Neale
    When using HAProxy for virtual hosting, I can see how to use the Host header at the frontend to decide which backend to route to. However, is it possible to make the backend a URL which includes a path (not unlike what you would do with Apache or nginx when setting up virtual hosting)? http://www.techrawr.com/tag/haproxy/ shows most of it. But what if the backends were on the one server, with /backend1 and /backend2 as the paths?
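    HAProxy can't point a server line at a URL, but it can rewrite the request path per backend, which amounts to the same thing. A sketch in the 1.4-era config syntax (hostnames, ports, and the path prefixes are hypothetical):

        frontend http-in
            bind *:80
            acl host_a hdr(host) -i a.example.com
            use_backend be_a if host_a
            default_backend be_b

        backend be_a
            # prepend /backend1 to every request path
            reqrep ^([^\ :]*)\ /(.*)  \1\ /backend1/\2
            server web1 192.168.0.10:8080

        backend be_b
            # prepend /backend2 instead
            reqrep ^([^\ :]*)\ /(.*)  \1\ /backend2/\2
            server web1 192.168.0.10:8080

    Newer releases replace reqrep with http-request set-path, but the idea is the same.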

    Read the article

  • How to disable SSL 2.0 on IIS 7.5?

    - by John Hoge
    I've seen this KB Article which Microsoft put out covering how to remove SSL 2.0 on IIS 7.0 and earlier, but I can't find anything advising on how to do the same on IIS 7.5. The registry keys mentioned on that KB article are no longer in the registry.
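    For what it's worth, on Server 2008 R2 the SCHANNEL keys usually have to be created rather than edited; their absence just means the defaults are in effect. A hedged sketch (run from an elevated prompt, then reboot):

        reg add "HKLM\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 2.0\Server" /v Enabled /t REG_DWORD /d 0 /f

    This is the same mechanism the IIS 7.0 KB article describes; the key simply isn't pre-populated on 7.5.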

    Read the article

  • Permissions on mac for itunes library with multiple users - idea

    - by John
    I currently have a lot of music on an external drive with my iTunes library set up from there. However, periodically, when the external drive isn't connected, iTunes will default back to the library location under my home directory. I don't want to mess with an external drive, as my Mac HD is large enough to house the music collection. However, I have four family members, all with their own logins, using this same gob of music. I don't want four copies of the library, only one with all libraries referencing it. So, what I want to do is:
    1. Move all music files to a shared directory at /Macintosh HD/Users/music. I created this directory and adjusted permissions so all four users can read and write to it.
    2. Get all four accounts to reference this library instead of the external or local home locations.
    I am hoping I can just check the box to keep the library organized in my account (the admin) and let iTunes move it all, then delete the current libraries for each account and re-add from the new shared location. Will the iTunes organization process cause permissions issues, either by restricting file access to my account only, or write permissions, or any other gotcha? I am having a hard time coming up with a smooth solution that won't break everything and cause me to have mega duplicates or access issues. I would prefer not to do any XML library file editing if possible. Am I dreaming? Thanks for help.
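    The usual gotcha is exactly the one anticipated here: files one user adds are owned by that user, and plain Unix permissions don't propagate to new files. One sketch that sidesteps it, assuming the shared folder resolves to /Users/music (the path is taken from the question), is an inherited ACL:

        sudo chmod -R +a "everyone allow read,write,append,execute,delete,list,search,add_file,add_subdirectory,delete_child,file_inherit,directory_inherit" /Users/music

    With the inherit flags set, anything iTunes creates inside the folder stays readable and writable by all four accounts.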

    Read the article

  • DPM server 2010 Attach agent error : administrator privileges missing?

    - by Michael
    Hello, I'm hoping you can help me out with this little problem. I installed DPM 2010 in our test environment to test backups of Exchange 2010 servers. The environment includes:
    - 1x DC
    - 2x Exchange Server 2010
    - 1x DPM 2010 server
    All of these are running on Windows Server 2008 R2 virtual machines; the hosts are using Hyper-V. The problem goes like this:
    1. I tried to install the agents from the DPM server GUI, which failed saying I didn't have the correct permissions.
    2. So then I tried the manual installation using the commands from the Microsoft site: http://technet.microsoft.com/en-us/library/bb870935.aspx
    3. The agent installation worked, but when I get to attaching the agents to the DPM server it still gives me the error saying that the specified account does not have administrator rights.
    4. I tried the domain admin, users who are domain admin + local admin, and single local admins.
    5. I have turned off the Windows firewall and made sure all the services are running.
    So now I'm out of ideas and really need help; the agent attach to the DPM server is the last thing holding me back from deploying everything to the production site. Any help would be really appreciated.
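    For reference, the manual attach is a two-step dance, and a common stumble is skipping the first half. A sketch of the usual sequence (server and account names are hypothetical):

        # on each protected server (elevated command prompt), point the agent at the DPM server:
        "%ProgramFiles%\Microsoft Data Protection Manager\DPM\bin\SetDpmServer.exe" -dpmServerName DPM01

        # then, in the DPM Management Shell on the DPM server:
        Attach-ProductionServer.ps1 DPM01 exch01.contoso.com domain\admin password domain

    If the error persists with a known-good domain admin, UAC remote restrictions on the protected servers are worth ruling out.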

    Read the article

  • Config xampp never to cache pages from localhost

    - by Michael Mao
    Hello everyone: I am not a webmaster, and that was exactly the reason why I installed XAMPP 1.7.2 on Windows XP SP2 instead of manually configuring Apache, MySQL and PHP to cooperate with each other. Now I am having a problem disabling the caching of pages from localhost. Some suggested just forcing the browser not to cache, using the Firefox Web Developer bar or something similar, but I feel it would be better if I could configure the Apache server in XAMPP to never allow pages from localhost to be cached. I guess this is done somewhere in httpd.conf? LoadModule cache_module modules/mod_cache.so: would this module be helpful in this case? Doc here: mod_cache. I am not very sure this would resolve the problem. Could anyone confirm whether this approach is feasible? I'd like to work it out myself, given the fact that I am on the right track... Many thanks in advance
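    One note worth hedging: mod_cache controls Apache's own server-side cache (not enabled by default in XAMPP), whereas stale pages in this situation usually come from the browser cache. Sending no-cache response headers via mod_headers addresses that directly. A sketch for httpd.conf, assuming the stock XAMPP document root:

        LoadModule headers_module modules/mod_headers.so

        <Directory "C:/xampp/htdocs">
            Header set Cache-Control "no-cache, no-store, must-revalidate"
            Header set Pragma "no-cache"
            Header set Expires "0"
        </Directory>

    Restart Apache from the XAMPP control panel afterwards.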

    Read the article

  • Windows XP procedurally drop packets

    - by Michael J Mulligan
    We have a need on our office network to procedurally drop incoming packets on an XP machine acting as a server. By "procedurally" we mean dropping a percentage of the packets incoming to the XP machine from a specific IP. Asked here because these seem like server-related questions, and none of us really has any idea how to execute this. Another option would be to introduce intermittent latency for an incoming IP. Thank you for your help.

    Read the article

  • Moving Portable Office. Have a LAN & Phone line query?

    - by John Smith
    We are about to move a portable office that has 8 phones and 12 LAN connections. They are all wired back to our main switch and Nortel BCM200 phone system, which are only about 20 m (65 ft) away. After the move the office will be 180 m (600 ft) away from these. What is the maximum cable length for a digital phone line? I am aware that a LAN connection can only be 100 m (328 ft) when using Cat5e UTP. Does this rule apply to phone connections also? If so, how can I extend beyond 100 m for the phones? I was going to install about three cabinets, 3 switches and 6 patch panels for the LAN connections, but the idea struck me tonight that maybe I could run a fibre optic line. Would this be feasible? Any feedback on this is greatly appreciated. Thanks

    Read the article

  • Understanding Security Certificates (and their pricing)

    - by John Robertson
    I work at a very small company, so certificate costs need to be absolutely minimal. However, for some applications we do need our customers to get that warm fuzzy not-using-a-self-signed-certificate feeling. Since creating a "certificate authority" with makecert really just means creating a public/private key pair, it seems pretty clear that issuing a certificate FROM such a "certificate authority" really just means generating a second public/private key pair and signing it with the private key that belongs to the "certificate authority". Since the keys are signed, anyone can verify they came from the certificate authority I created; or, if VeriSign gave me the pair, they sign it with one of their own private keys, and anyone can use VeriSign's corresponding public key to confirm VeriSign as the source of the keys. Given this, I don't understand why, when I go to VeriSign or GoDaddy, they have rates only for yearly plans, when all I really want from them is a single public/private key pair signed with one of their private keys (so that anyone else can use their public key to confirm that, yes, they gave me that key pair and they confirmed I was who I said I was, so you can trust my key pair as belonging to a legitimate third party). Clearly I am misunderstanding something; what is it? Does VeriSign retire their public/private key pairs periodically, so that my VeriSign-signed key pair "expires" and I need new ones? EDIT: I learned that the certificate has an internal expiration date, and it also maintains an internal value stating whether it can be used to sign other certificates (i.e. sign other private/public key pairs stored as certificates). Can't I get a few (even one) non-signing certificates signed by someone like VeriSign that I can use for authentication/encryption without a yearly subscription?
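    The signing mechanics described above are easy to demonstrate locally. A minimal OpenSSL sketch of playing CA for yourself (file names are hypothetical):

        # create the CA: a key pair plus a self-signed certificate
        openssl genrsa -out ca.key 2048
        openssl req -new -x509 -days 3650 -key ca.key -out ca.crt

        # create the server key pair and a certificate signing request
        openssl genrsa -out server.key 2048
        openssl req -new -key server.key -out server.csr

        # sign the request with the CA key, yielding a certificate chained to the CA
        openssl x509 -req -in server.csr -CA ca.crt -CAkey ca.key -CAcreateserial -days 365 -out server.crt

    What the commercial CAs sell isn't the cryptography, which is the above; it's the fact that their root certificate already ships in everyone's browser, and the expiry date bounds how long they vouch for you, which is why the pricing is per year.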

    Read the article

  • Gateway GT5220 Boot/POST Failure

    - by John Rudy
    I have a Gateway GT5220 I'm troubleshooting. It is, in fact, the machine I just gave my father for his birthday a couple months ago. (Prior to that, it was my home PC. My home PC is now the MacBook on which I'm writing this.) Before going any further, I suspect that the answer will be, "It's worse than that, it's dead, Jim, it's dead, Jim, it's dead, Jim." At least mobo and/or CPU. The initial symptoms were as follows:
    - Turn on power
    - All fans fire up (thus making it so I can't hear if the hard drive is spinning or not, nor are my hands sensitive enough anymore to feel it)
    - No LEDs remained lit on the front panel. (Initially, the hard drive indicator flashed briefly.)
    - No beep, no video, no nothing.
    Following some advice I found here, I tried to "drain the stored power." After following those steps, the new symptoms were:
    - Turn on power
    - All fans fire up
    - The front panel LEDs remained lit!
    - After about 20, maybe 30 seconds, we had video! Sort of.
    We got to the Gateway splash/POST screen, which appeared thoroughly corrupted. How corrupted? Well, I imagine it's what a POST screen would look like after reading the wrong passage out of the Necronomicon. [Screenshot of the corrupted POST screen omitted.] It stayed there. I gave it at least 5, maybe 6 minutes, and it didn't move. So I shut her down, started her up again, and now (this is where we currently stand, symptomatically) we have this:
    - Turn on power
    - All fans fire up
    - The front panel LEDs remain lit
    - No video, no beep, no nothing.
    I'm a software guy; I haven't done real hardware troubleshooting in years. My gut tells me that the mobo and/or CPU is fried, and unfortunately my gut didn't get to be as big as it is by being wrong all the time. :( In addition to the link above, I have read all of the following (trying to save you some LMGTFY trouble):
    - Gateway Support POST Error Messages and Handling
    - About a zillion (useless) POST beep code sites
    - A kioskea.net post indicating that most likely we're at what I consider "total loss" (mobo and/or CPU)
    My questions:
    1. Are there any conditions other than mobo/CPU that could cause symptoms like these?
    2. Is it worth my time to try the next hardware troubleshooting step? (i.e., remove all non-critical hardware from the machine, try to boot, and systematically replace components one by one until we find the failing one)
    3. Which mobos will fit in the Gateway GT5220 case (with rear ports correctly aligned)?
    (Why this is not a dupe: I wouldn't have posted this question if it hadn't been for the funkadelic possessed video display on the one occasion we got video out. I think that justifies this not being an exact dupe. Of course, if the community overrules, I will understand.)

    Read the article

  • Log and Block Website using Windows 2008 SBS

    - by John
    A client has asked me to set up Windows 2008 SBS to block and log websites from a list they will provide. As far as I know they only have Standard edition, which means I cannot use ISA. I was thinking of using Squid authenticated against Active Directory. There is no budget for additional software. Does anyone know of a different/better solution using either open source software or software that is available in Windows 2008 SBS? Thanks
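    For the Squid route, the block-and-log part is only a few lines of squid.conf; it's the AD authentication that takes the real work. A sketch of the filtering half (paths are hypothetical):

        # one domain per line in the blocklist file, e.g. .example.com
        acl blocked_sites dstdomain "/etc/squid/blocked_domains.txt"
        http_access deny blocked_sites
        access_log /var/log/squid/access.log squid

    Authentication against AD is typically layered on with Squid's ntlm_auth/winbind helpers from Samba, which also puts usernames into that same access log.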

    Read the article

  • Overclock Failed

    - by John
    This morning when I tried to turn on my computer, the monitor would not display anything. The computer was on, but there was no UI on the monitor and no beep. I turned off the computer and turned it back on again. This time, the computer was on for about 2 seconds, automatically turned itself off for about 2 seconds, and automatically turned itself back on. This happened two times without the monitor displaying any UI. On the third try, the computer did the same thing, but when it turned itself back on, the monitor displayed this:
        American Megatrends
        Asus P8P67 LE ACPI BIOS Revision 1011
        CPU: Intel Core i5 2500K CPU @ 3.30GHz, Speed: 3300 MHz
        Total Memory: 4096 MB (DDR3-1333)
        USB Devices Total: 0 Drive, 1 Keyboard, 1 Mouse, 2 Hubs
        Detected ATA/ATAPI Devices:
        SATA6G_2: Hitachi HDS721010CLA332
        SATA3G_1: Corsair Force 3 SSD
        SATA3G_2: HL-DT-ST BD-RE UH12CS28
        Overclocking Failed! Please enter setup to reconfigure your system.
        Press F1 to Run Setup.
    After I pressed F1, it went into the Asus EFI BIOS utility. I couldn't do anything in there, so I just exited. After which, the computer turned itself off and on again, and I was able to use the PC as usual. This has never happened to this computer before, and the day before nothing seemed wrong; just regular use and some gaming.

    Read the article

  • Copying large files from USB devices to the internal hard drive fails on Mac OS

    - by John M. P. Knox
    I have a second-generation 13" MacBook Air running Mac OS X 10.6.6 with a 2.13 GHz processor, 4 GB of RAM, and a 256 GB SSD hard disk. I often get failures when I attempt to copy a large file or large collection of files from an external USB drive (typically a "Firewire" generation Drobo) to the internal drive. The failure behaves almost exactly as if I had pulled the USB cable from the computer in mid-transfer. I get a warning that I have removed the hard disk improperly. After this event, the drive no longer appears mounted in the finder, and I have to unplug and reinsert the USB cable to mount the drive again. I have also seen a similar problem when using Aperture 3 to import a large number of photos and videos from a USB Compact Flash card reader. The import will fail and I will have to unplug the Card Reader and import the missing items. Oddly, reversing the direction of the copy seems pretty reliable. I've never had a problem copying a large file to a USB device, meaning that I have quite a few large files which are stranded on my Drobo. Model Identifier: MacBookAir3,2 Boot ROM Version: MBA31.0061.B01 I have seen a similar issue reported on Apple's website: http://discussions.apple.com/thread.jspa?threadID=2648590&tstart=0 The only suggested resolutions there seems to be switching to another form of connectivity (e.g. firewire, which does not exist on MacBook Air), or downgrading to Mac OS 10.6.4, or reverting the USB kernel extensions to the 10.6.4 versions: http://discussions.apple.com/message.jspa?messageID=12566073#12582956 I'm not too keen on the idea of downgrading kernel extensions. Does anyone know of a hardware revision without this issue that I can trade up to? Are there any other potential solutions out there?

    Read the article

  • OpenSolaris: unable to contact repository, connected to internet via proxy

    - by John-ZFS
    OpenSolaris b134: unable to set the package catalog. This system is connected to the internet via a proxy; while this works in the browser, how do I make the console/terminal aware of it?
        user1@opensolaris134:~# pkg set-authority -O http://pkg.opensolaris.org/dev opensolaris.org
        pkg set-publisher: Could not refresh the catalog for opensolaris.org
        user1@opensolaris134:~# pkg image-update
        pkg: 0/1 catalogs successfully updated:
        Unable to contact valid package server
        Encountered the following error(s):
        Unable to contact any configured publishers.
        This is likely a network configuration problem.
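    pkg(1) honors the standard proxy environment variables, so exporting them in the shell before retrying is the usual fix (the proxy host and port here are placeholders):

        export http_proxy=http://proxy.example.com:8080
        export https_proxy=http://proxy.example.com:8080
        pkg refresh --full

    To make this stick across sessions, the same exports can go in ~/.profile.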

    Read the article

  • Installing make with wget

    - by John
    How can I install make with wget on CentOS? I tried:
        cd /tmp
        wget ftp.gnu.org/pub/gnu/make/make-3.81.tar.gz
        tar xfz make-3.81.tar.gz
        cd make-3.81
        PATH=/usr/local/bin:/usr/bin:/bin ./configure
        patch -p1 < make-3.81-cygwin.patch
        patch -p0 < make-3.81-cygwin_MAKE_expansion.patch
        PATH=/usr/local/bin:/usr/bin:/bin make
        PATH=/usr/local/bin:/usr/bin:/bin make install
    However, I got an error message stating I had no acceptable C compiler.
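    Two observations, hedged as guesses about the setup: the two patch steps come from a Cygwin build recipe and don't apply on CentOS, and the compiler error just means gcc isn't installed. Installing the toolchain first should let a plain configure and make succeed:

        yum -y install gcc
        cd /tmp/make-3.81
        ./configure
        make
        make install

    On CentOS it's usually even simpler to skip the source build entirely with yum -y install make.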

    Read the article

  • IIS Express only utilizes 13% of i7 Quad Core

    - by John Nevermore
    Since one of my scripts got incredibly complex, I was benchmarking the performance of moving some JavaScript processing logic to the server side in my ASP.NET MVC 4 application. According to taskmgr.exe, IIS Express only utilizes 13% of my i7. I decided to throw in 3 parallel tasks calculating the Fibonacci sequence up to 50, and IIS Express still wouldn't utilize more than 13% of my CPU. Is there anything I can do so that the application utilizes the full CPU, as it would on a real server?
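    A back-of-the-envelope observation: on a hyper-threaded quad-core i7, Task Manager counts 8 logical processors, and 100% / 8 ≈ 12.5%, so a steady 13% reading is almost exactly one saturated core. That pattern usually points at the work being serialized onto a single thread (for example, parallel tasks awaited within a single request) rather than at a limit in IIS Express itself.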

    Read the article

  • Exchange 2003 IMAP not working for some users

    - by John Gardeniers
    We normally don't have a need for IMAP connections from outside the company network, but in order to allow one user to use IMAP on a portable device I've turned it on and opened port 993 on the firewall. When the user in question was unable to get connected, I tested this using Outlook remotely. Start by creating a new IMAP account in Outlook using a test account: no problems, it worked perfectly. Now try the same thing using the account of the user who actually needs to connect, and it's a no-go; Outlook simply keeps prompting for logon credentials. Next I tried using my own account and that too failed. Testing with a couple of other accounts worked perfectly. Interestingly enough, with my own account I've used IMAP on a Mac before (internally) without a problem, and I'm not aware of anything that has changed which could affect IMAP on my account. Checking the user settings in ADUC showed that all accounts have the same Exchange protocol settings; specifically, IMAP is enabled. A check of the event logs on the server reveals no entries for the connection attempts, making this kind of difficult to debug. Has anyone here encountered such a situation and, even more importantly, what caused it?

    Read the article

  • check_snmp with snmpv3 protocol giving "Unkown Report message" error

    - by John
    I'm trying to add a Nagios command to use SNMPv3 for monitoring printer status messages. When using the check_snmp command, I get the following error:
        External command error: snmpget: Unknown Report message
    Here is the command I'm typing in:
        ./check_snmp -P 3 -H <hostname> -L authPriv -U snmpuser -A snmppassword -X snmppassword -o 1.3.6.1.4.1.11.2.4.3.1.2.0 -C public -d "STRING:" -a MD5
    These values for auth key, private key, username, etc. all work when using snmpwalk. Can someone enlighten me as to what that error message really means?
    EDIT: It looks like check_snmp isn't taking my v3 credentials when passing over to snmpget. Here is my input with the verbose option:
        ./check_snmp -H <hostname> -o 1.3.6.1.2.1.2.2.1.10.1 -C public -m ALL -P 3 -L authPriv -U snmpuser -a MD5 -A snmppassword -x DES -X snmppassword -v
    And here is the output:
        /usr/bin/snmpget -t 1 -r 5 -m ALL -v 3 [authpriv] <hostname>:161 1.3.6.1.2.1.2.2.1.10.1
        External command error: snmpget: Unknown Report message
    So I guess now my question would be: why isn't check_snmp passing all the command-line options to snmpget?
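    One hedged note on reading that verbose line: check_snmp masks the security arguments in its output (the [authpriv] placeholder stands in for them), so the credentials may well be getting through. Running the equivalent snmpget by hand takes the plugin out of the equation:

        snmpget -v 3 -l authPriv -u snmpuser -a MD5 -A snmppassword -x DES -X snmppassword <hostname> 1.3.6.1.2.1.2.2.1.10.1

    If that fails with the same "Unknown Report message", the problem is in the SNMPv3 exchange itself (often an engine-ID or agent-side configuration issue) rather than in Nagios.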

    Read the article
