Search Results

Search found 40479 results on 1620 pages for 'binary files'.


  • Remotely Managing Storage on Hyper-V 2012 Core

    - by Vazgen
    I have a Hyper-V Server 2012 Core installation that I am remotely managing from a Windows 8 client. I can connect with Hyper-V Manager, Server Manager, and MMC. However, I don't understand how I can manage the physical hard drive (for example, deleting vhdx files, creating folders, etc.) from my Windows 8 client. I tried to map the remote share as follows: q: \\MyServer\c$. It said the command completed successfully, but I don't see the drive in Explorer on my client. I can get to it in cmd.exe on the client, but how can I manage it in a GUI? Running explorer q: throws an error:

    Read the article

  • Why are my Google searches redirected?

    - by Please Help
    This machine was infected with various malware. I scanned the system with Malwarebytes, which found and removed some 600 or so infected files. Now the machine seems to be running well, with one exception: some Google search results are being redirected to shady search engines. If I copy the URL from the Google search results and paste it into the address bar, it goes to the correct site, but if I click the link I am redirected somewhere else. Here is my log file from HijackThis: http://pastebin.com/ZE3wiCrk

    Read the article

  • How to set up a VPN on a home network

    - by Tone
    I am a software developer. I travel and sometimes need to access my files at home and tweak other family members' computers. I would like to connect to my home network via VPN and then RDP into whatever machine I need. Currently I have a Windows Server 2008 machine, which is my file server, database server, web server (for development work), source control repository, etc. (and also somewhat of a workstation when I need it to be). I want to run my VPN through this same machine. I have a Linksys WRTG54 router. My ISP is AT&T DSL with a dynamic IP address, so I'm assuming I'll either need to request a static IP or sign up with one of those services that keep a fixed hostname synced up with your dynamic IP. While I do understand software engineering, I am no expert in networking. What do I need to do to set up my VPN?

    Read the article

  • Can't boot Windows after installing Linux

    - by user4035
    I have a partition, /dev/sdb1, where my old Windows XP installation resides. All the files are intact and I can see them when mounting the disk from Linux. Linux is on /dev/sdb2. But when I choose Windows at the LILO prompt, it doesn't load. I have the following lilo.conf (a troubleshooting sketch follows below):

        boot = /dev/sdb

        # Linux bootable partition config begins
        image = /boot/vmlinuz
        root = /dev/sdb2
        label = Linux
        read-only    # Partitions should be mounted read-only for checking
        # Linux bootable partition config ends

        # Windows bootable partition config begins
        other = /dev/sdb1
        label = Windows
        table = /dev/sdb
        # Windows bootable partition config ends

    What can be wrong?

    Read the article
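
    A minimal troubleshooting sketch for the setup above, assuming the device names from the question (Windows on /dev/sdb1, LILO installed to /dev/sdb). The usual suspects are an unflagged boot partition and forgetting to re-run lilo after editing lilo.conf:

        # Check whether the Windows partition is marked bootable/active:
        sudo fdisk -l /dev/sdb

        # Confirm /dev/sdb1 really contains a Windows boot sector:
        sudo file -s /dev/sdb1

        # LILO only reads lilo.conf when the boot loader is rewritten, so after
        # any edit the map must be reinstalled:
        sudo lilo -v

    Note that Windows XP is often unwilling to boot from the second BIOS disk; if /dev/sdb is not the first disk, the "other" section may additionally need LILO's map-drive entries to swap drives 0x80 and 0x81.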

  • Domain Key Entries

    - by natediggs
    More BIND DNS questions. OK, my changes to the zone files are now propagating out. Now I'm having a problem with the domain key entries I'm trying to create. I'm starting by trying to set the domain key policy, so I added the following entry to my zone file (actual domain XXX'd out). Based on everything I've read, this is properly formatted and should work, but when I try to verify the DNS entry for our domain it doesn't show up (a verification sketch follows below).

        _domainkey.XXXX.com TXT "t=y; o=~;"

    Is there something I'm missing? Nate

    Read the article
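
    A hedged verification sketch, assuming BIND serves the zone and using the placeholder names from the question (replace ns1.XXXX.com with the actual master). Querying the authoritative server directly bypasses resolver caching:

        # Ask the authoritative server for the record:
        dig +short TXT _domainkey.XXXX.com @ns1.XXXX.com

        # Ask a public resolver to see whether the change has propagated:
        dig +short TXT _domainkey.XXXX.com @8.8.8.8

        # If the record exists only on the master, bump the zone serial and reload:
        sudo rndc reload XXXX.com

    One common zone-file gotcha worth checking: without a trailing dot, the name "_domainkey.XXXX.com" is treated as relative to the zone and becomes _domainkey.XXXX.com.XXXX.com; writing "_domainkey.XXXX.com." (or just "_domainkey") avoids that.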

  • Using the promoted builds plugin to tag a Subversion repository in Jenkins

    - by mark
    We have a job that builds based on data from 4 different SVN repositories. I want to allow QA to promote a build, so that the revisions participating in the build are tagged with the build number and an optional label. I have run into the following problem: the promoted build may not be the most recent build, so how do I know the SVN revision of each of the four repositories used during that build? I know that each build has this information in the revision.txt and build.xml files associated with it, but how does it become available in the context of a promotion (see the sketch below)? Thanks. P.S. Asked here before, but did not get a satisfying answer.

    Read the article
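
    A hypothetical sketch of a shell step that a promotion process could run, not the plugin's documented behaviour: it assumes the promotion can read the revision.txt archived with the promoted build, that SVN credentials are cached, and that PROMOTED_JOB_NAME / PROMOTED_NUMBER come from the promoted-builds plugin environment. All paths, the LABEL parameter, and the revision.txt line format are assumptions to verify against a real build directory:

        #!/bin/sh
        set -e

        # Where Jenkins archived the promoted build's revision information (assumed path).
        REVFILE="$JENKINS_HOME/jobs/$PROMOTED_JOB_NAME/builds/$PROMOTED_NUMBER/revision.txt"
        TAG="build-$PROMOTED_NUMBER${LABEL:+-$LABEL}"   # LABEL: optional promotion parameter

        # Assumes each line looks like "<module URL>/<revision>".
        while IFS= read -r entry; do
            url="${entry%/*}"
            rev="${entry##*/}"
            svn copy -r "$rev" "$url" "${url%/trunk}/tags/$TAG" \
                -m "Promotion: tagging $url@$rev as $TAG"
        done < "$REVFILE"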

  • How can I shrink my Windows partition further than Disk Management is allowing?

    - by Walkerneo
    I just bought a new computer with a 2 TB hard drive that has only a single partition. I would like to divide this into at least 4 partitions, but when I try to shrink the current partition, Disk Management says the total size is 1888171 MB and that the available shrink space is only 939075 MB. The used disk space is about 40 GB right now, so why can't I shrink it to somewhere around that? I read here: http://www.howtogeek.com/howto/windows-vista/working-around-windows-vistas-shrink-volume-inadequacy-problems/ that this is because of unmovable system files, but I doubt that is the only problem. I would like to get this partition down to 500 GB. How can I do this?

    Read the article

  • Debian installation without an internet connection

    - by Gobliins
    Hi, I want to install some Debian distributions (Grip, Crush, Lenny...) for the arm/armel architectures (www.emdebian.org/), following this guide: www.aurel32.net/info/debian_arm_qemu.php. The problem I have is that I don't have an internet connection from my Linux VM or QEMU; I am behind a proxy. Is there a way I can download all the needed files in advance and save them to disk, so that I don't need an internet connection during the installation (a sketch of the options follows below)? I am working under Windows right now. My regards

    Read the article
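
    A rough sketch of two ways around the missing connection, assuming the corporate proxy address is known (the address below is a placeholder) and that the target system uses apt:

        # Option 1: let apt and friends go through the proxy from inside the VM:
        export http_proxy=http://proxy.example.com:8080
        export https_proxy=$http_proxy
        sudo -E apt-get update

        # Option 2: pre-download the packages on any machine that does have access,
        # then copy the .deb files into the VM and install them offline:
        sudo apt-get install --download-only --reinstall -y <packages>
        # ...the files land in /var/cache/apt/archives; on the offline system:
        sudo dpkg -i /var/cache/apt/archives/*.deb

    The <packages> list is a placeholder; tools such as apt-offline automate the same download-elsewhere, install-offline workflow.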

  • Thunderbird 3 not displaying email in local folder

    - by ron grubman
    I installed Thunderbird 3 on a new computer, set up to use IMAP to get my Gmail -- that works fine. I copied all of my existing emails from the old computer into the AppData directory for Thunderbird's Local Folders. All the files are the right size, so I know the data is present. When I run Thunderbird on the new computer, all of the folders show up under Local Folders, but when I click on any folder, all I get is a blank pane -- no emails are shown. Additional info: Bitdefender does see the missing emails, because one or two are tagged as suspicious, but Windows 7 search is not seeing them. Can anyone instruct me on how I can get Thunderbird to show the old emails? Thanks

    Read the article

  • How to use ccache selectively?

    - by Anonymous
    I have to compile multiple versions of an app written in C++, and I am thinking of using ccache to speed up the process. The ccache how-tos have examples which suggest creating symlinks named gcc, g++, etc. and making sure they appear in PATH before the original gcc binaries, so that ccache is used instead. So far so good, but I'd like to use ccache only when compiling this particular app, not always. Of course, I could write a shell script that creates these symlinks every time I want to compile the app and deletes them once the app is compiled, but that looks like filesystem abuse to me. Are there better ways to use ccache selectively rather than always (see the sketch below)? For compilation of a single source file, I could just call ccache manually instead of gcc and be done, but I have to deal with a complex app that uses an automated build system for multiple source files.

    Read the article
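
    A minimal sketch of using ccache for just one build, assuming the app's build system honours the usual CC/CXX variables (most make- and autotools-based builds do); nothing here touches the global PATH:

        # One-off build with ccache, no symlinks needed:
        make CC="ccache gcc" CXX="ccache g++"

        # CMake-based builds can say the same thing at configure time:
        cmake -DCMAKE_C_COMPILER_LAUNCHER=ccache -DCMAKE_CXX_COMPILER_LAUNCHER=ccache ..

        # Or keep the symlink trick, but confined to one directory that only this
        # app's build script prepends to PATH:
        mkdir -p ~/ccache-bin
        for t in gcc g++; do ln -sf "$(command -v ccache)" ~/ccache-bin/$t; done
        PATH="$HOME/ccache-bin:$PATH" make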

  • Security considerations in providing VPN access to non-company issued computers [migrated]

    - by DKNUCKLES
    There have been a few people at my office who have requested the installation of Dropbox on their computers to synchronize files so they can work on them at home. I have always been wary of cloud computing, mainly because we are a Canadian company and enjoy the privacy of being outside the reach of the Patriot Act. The policy before I started was that employees with company-issued notebooks could be issued a VPN account, and everyone else had to use a remote desktop connection. The theory behind this logic (as I understand it) was that we had the ability to lock down the notebooks, whereas the employees' home computers were outside of our grasp. We had no way to ensure they weren't running as administrator all the time or that they were running AV, so they were at higher risk of being infected with malware and could compromise network security. With the increase in people wanting Dropbox, I'm curious whether this policy is too restrictive and overly paranoid. Is it generally safe to provide VPN access to an employee without knowing what their computing environment looks like?

    Read the article

  • Can I use the Distant TV server with XBMC?

    - by Chiyou
    I want to use the Distant TV server with XBMC because it seems to work better than MediaPortal. I have tried MediaPortal for only 1-2 days, but I have already run into a couple of problems. The Pinnacle software that comes with my USB stick, together with the Distant TV server, is a superior solution. How can I get it to work with XBMC? Or, how can I map one remote control key to PCTV and another remote control key to XBMC? I want to watch TV and browse my video files.

    Read the article

  • How can I make a non-destructive copy of a (NTFS) partition?

    - by violet313
    I want to recover some deleted files from a healthy NTFS partition on an undamaged hard disk. In order to leave the partition undisturbed, I plan to use dd to clone the partition to a raw image file and then attempt recovery from that mounted clone. Will dd if=/dev/sd<xn> of=/path/to/output.img perform a non-destructive copy (see the sketch below)? Is attempting a recovery from a dd clone the best approach? [Edit, with regard to Deltik's answer, I need to be a bit clearer about what I'm asking.] For example: is there some software that can do something more with the original sectors? If it were a damaged hard disk, I am aware that any kind of read is potentially destructive, but assuming my disk head is not going to suddenly fail, am I reducing my chances of a successful recovery (at any cost) by using an apparently non-destructive single read of my undamaged hard disk? (By the way, I am planning on using ntfsundelete and testdisk for the recovery.)

    Read the article
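
    A short sketch of the non-destructive route, assuming the partition device and image path from the question (written here with placeholders). A plain dd only reads the source as long as if= and of= are not swapped; GNU ddrescue does the same copy but keeps a map file so retries never re-read recovered areas:

        # Clone the partition; only /dev/sdXN is read, only the image is written:
        sudo dd if=/dev/sdXN of=/path/to/output.img bs=4M conv=noerror,sync status=progress

        # Alternative: ddrescue (Debian/Ubuntu package "gddrescue") with a map file:
        sudo ddrescue -d /dev/sdXN /path/to/output.img /path/to/output.mapfile

        # Then work only on the copy:
        sudo mount -o loop,ro /path/to/output.img /mnt/clone
        ntfsundelete /path/to/output.img --scan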

  • Writing becomes slow after a few writes

    - by user1566277
    I am running embedded Linux on ARM with an SD card. While writing huge amounts of data I see bizarre effects. For example, when I dd a 15 MB file a few times, it writes the file (normally) in less than 2 seconds. But after, let's say, 3-4 times, it sometimes takes 15 to 30 seconds to write the same file. If I sync after writing the file, this does not happen, but then the sync takes a long time instead. If there is enough of a gap between writing two files, then presumably the kernel syncs by itself. How can I tune things so that a write always finishes within 2 seconds (see the sketch below)? The file system I am using is ext3. Any pointers?

    Read the article
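
    A hedged tuning sketch: the stalls described are consistent with the kernel buffering many dirty pages and then flushing them all at once, so limiting the dirty thresholds (or bypassing the page cache) spreads the cost out. The byte values below are illustrative starting points, not recommendations:

        # Keep less dirty data in RAM so writeback starts early and stays small:
        sudo sysctl -w vm.dirty_background_bytes=2097152   # begin background flush around 2 MB
        sudo sysctl -w vm.dirty_bytes=8388608               # block writers beyond about 8 MB

        # Or bypass the page cache for the bulk copy itself:
        dd if=/path/to/source of=/mnt/sdcard/file bs=1M oflag=direct

        # A shorter ext3 commit interval also bounds how much piles up between flushes:
        sudo mount -o remount,commit=1 /mnt/sdcard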

  • Windows 7 - Can't get access to a shared folder from one computer to another

    - by Carbonara
    I have 2 Windows 7 computers, and I'm trying to share a folder (that I want password protected) outside of the homegroup. Both computers are part of the same workgroup, and I have the same user account/password combination on both. I also have password protected sharing turned on in the Network and Sharing Center, along with file and printer sharing. On computer 1 I have right-clicked the folder and selected to share it. When I navigate via the network from computer 2 to computer 1, the shared folder shows up, but double-clicking it to open it gives me an alert saying I don't have permission to access it, with no option to type in a user name and password. (According to the help files, I shouldn't even need to type the password if both computers have the same username/password, but I would need it if I'm logged in as a different user.) It's just a blanket denial of access.

    Read the article

  • Tar and gzip together, but the other way round?

    - by Boldewyn
    Gzipping a tar file as a whole is drop-dead easy and even implemented as an option inside tar. So far, so good. However, from an archiver's point of view, it would be better to tar the individually gzipped files. (The rationale is that data loss is smaller if a single gzipped file is corrupt than if your whole tarball is corrupted due to gzip or copy errors.) Does anyone have experience with this? Are there drawbacks? Are there more solid/tested solutions for this than the following (see also the sketch below)?

        find folder -exec gzip '{}' \;
        tar cf folder.tar folder

    Read the article
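
    A small sketch of the per-file approach, assuming the goal is simply a tar of individually compressed files (archivers such as afio and dar implement the same idea with per-file compression built in):

        # Compress every regular file in place, then archive the tree of .gz files:
        find folder -type f -exec gzip -9 '{}' +
        tar cf folder.tar folder

        # To restore: unpack, then decompress recursively:
        tar xf folder.tar
        gunzip -r folder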

  • How to get Apache to follow symlink instead of downloading it?

    - by user792445
    I am just using the standard Apache config file, which says it follows symlinks, but when I hit the URL http://localhost/test it downloads the symlink file instead of following it. What config do I need to change to get Apache to follow the symlink instead of downloading it (see the sketch below)? This is an ls of the directory:

        $ ls -al
        total 10
        drwx------+ 1 SYSTEM SYSTEM  0 Oct 20 10:55 .
        drwx------+ 1 SYSTEM SYSTEM  0 Aug 26 12:27 ..
        -rw-r--r--+ 1 me     None   47 Oct 20 10:14 index.html
        lrwxrwxrwx  1 me     None   29 Oct 19 17:10 test -> /home/me/projects/test

    This is in my Apache config file:

        <Directory "D:/Program Files (x86)/Apache Software Foundation/Apache2.2/htdocs">
            Options Indexes FollowSymLinks
            AllowOverride None
            Order allow,deny
            Allow from all
        </Directory>

    Read the article
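
    A hedged guess plus a checking sketch: the ls output looks like a Cygwin shell, while the htdocs path points at a native Windows Apache build. A Cygwin "symlink" is just a small regular file carrying a "!<symlink>" cookie (or a .lnk shortcut), which a native Windows program will serve as an ordinary file, i.e. as a download. The commands assume the Cygwin shell and the file utility are available:

        # See what the "symlink" really is on disk:
        file test
        head -c 16 test | od -c

        # Workarounds (pick one): recreate it as a real NTFS symlink from an
        # elevated Windows prompt ("mklink /D test C:\path\to\target"), or drop
        # the symlink and add an Alias /test "C:/path/to/target" to httpd.conf.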

  • How to auto-update a website mirror with exceptions to certain pages?

    - by tomatosalad
    I'm currently mirroring a website on my server. The site itself is rarely updated, but it is updated often enough that info can become outdated quickly. I mirrored it first with wget, and this worked fine, but I made some changes: the original index.html used frames, but the site also provides a main.html which is essentially index.html without frames, so I deleted index.html and renamed main.html to take its place. I also did not want to mirror the webchat, blog or forum, so I deleted those files and directories, made directories "blogs", "forum" and "chat", and placed a PHP redirect in each of them, redirecting visitors to the original site. I'd like to auto-update the mirror (maybe once every 24-72 hours) but preserve the changes I made. Is this possible? How would I go about doing it (see the sketch below)? I am completely clueless as to how. Thanks for any and all help! :)

    Read the article
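
    A sketch of a cron-driven refresh, assuming wget is acceptable and using placeholder names; excluding the replaced sections keeps the local PHP redirect stubs untouched, and the index swap is re-applied after each run:

        #!/bin/sh
        set -e
        SITE=http://www.example.com          # placeholder for the mirrored site
        DEST=/var/www/mirror                 # the live mirror directory

        wget --mirror --no-parent --convert-links --no-host-directories \
             --exclude-directories=/blog,/forum,/chat \
             --directory-prefix="$DEST" "$SITE/"

        # Re-apply the local customisation: serve the frameless page as the index.
        cp "$DEST/main.html" "$DEST/index.html"

    A crontab entry such as "15 3 */2 * * /usr/local/bin/refresh-mirror.sh" would then run it every other day; the exact exclude paths depend on how the original site names those sections.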

  • Network Performance issue

    - by qubemarker
    We have three Ubuntu 10.04 servers. One server is a storage server and the other two are configured as clients. The storage server has a good amount of capacity and is integrated with a Windows Active Directory server for authentication. I am uploading some video files from both clients to the server. When I upload data from any one client alone I get about 26 MB/s, but when I upload from both clients simultaneously I only get about 8 MB/s from each client. I have gigabit Ethernet cards in all of the servers and an L2 managed gigabit switch for connectivity. I don't know why the data transfer rate drops so much with simultaneous reads and writes. I have tried all of the TCP stack related settings suggested here. Can anyone assist with getting better read/write performance out of this setup (see the sketch below)? Any help is appreciated.

    Read the article
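
    A hedged sketch for isolating the bottleneck, assuming iperf can be installed on all three machines: one client at 26 MB/s but two clients at 8 MB/s each points at the server side (disk or protocol) rather than the wire, and the two tests below measure the network and the storage separately:

        # Network only -- on the storage server:
        iperf -s
        # ...and on each client (run both at the same time to reproduce the contention):
        iperf -c storage-server -t 30

        # Storage only -- on the server, two concurrent writers, no network involved
        # (paths and sizes are placeholders):
        dd if=/dev/zero of=/data/test1 bs=1M count=2048 conv=fdatasync &
        dd if=/dev/zero of=/data/test2 bs=1M count=2048 conv=fdatasync &
        wait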

  • Backup of "Leavers" network directory

    - by Mez
    I want to create a backup of a leaver's network home directory. I've generally done this before by just creating an ISO with genisoimage and then burning it. However, it seems that the latest users have around 10 GB in their files. For archival purposes, I want to be able to burn these to multiple DVDs. How do I create these DVD ISO images (I know it's got something to do with tar and stream-media-size; see the sketch below), and then how do I restore them if I need them again? Using Debian.

    Read the article
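
    A sketch of two common ways to do this, assuming roughly 4.3 GiB usable per single-layer DVD and placeholder paths:

        # Approach 1: multi-volume tar, one volume per DVD (-L is in units of 1 KiB).
        # tar pauses and prompts for the next volume name; note that multi-volume
        # archives cannot be compressed.
        tar -c -M -L 4380000 -f home-user.tar.1 /home/leaver

        # Restore by naming the volumes in order:
        tar -x -M -f home-user.tar.1 -f home-user.tar.2 -f home-user.tar.3

        # Approach 2: one compressed stream, split into DVD-sized pieces:
        tar -czf - /home/leaver | split -b 4300m - home-user.tgz.part-
        # ...and re-join for the restore:
        cat home-user.tgz.part-* | tar -xzf -

    Each piece can then be wrapped in an ISO with genisoimage as before, or written to disc directly with growisofs.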

  • What is the @ sign at the end of the file permissions in the terminal?

    - by shannoga
    I have a sound file in my app that the iPhone does not play. After checking other problems, I checked the file's permissions in the Terminal. What I can see is that the permission string of this file has an @ at the end of it. I don't know if that is the problem, but it is the only difference from the other sound files, which play fine. What is this sign, and could it cause a problem (see the sketch below)? EDIT: Thanks, this is what I get:

        com.apple.FinderInfo:
        00000000  4D 34 41 20 68 6F 6F 6B 00 00 00 00 00 00 00 00  |M4A hook........|
        00000010  00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00  |................|
        00000020

    Thanks, Shani

    Read the article
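
    For reference, on OS X an @ after the permission string in ls output means the file carries extended attributes. A small sketch for inspecting them and, if need be, stripping them from a copy (the filename is a placeholder):

        # Show which extended attributes the file carries:
        ls -l@ sound.m4a
        xattr -l sound.m4a

        # The com.apple.FinderInfo blob shown above is Finder type/creator metadata,
        # not audio data; to rule it out, strip the attributes from a copy:
        cp sound.m4a sound-clean.m4a
        xattr -c sound-clean.m4a        # -c clears all extended attributes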

  • What is the proper way to set up the Apache document root in terms of privileges?

    - by racl101
    I have just installed Ubuntu 9.10 Server Edition on my machine, and I wish to run my own personal local server for other users on the same LAN. First, what directory structure is best for the web root? Should I just use /var/www/ and start putting web documents there, or should I create a folder elsewhere (maybe in a home directory)? Second, in the /var/www/ directory only the root user can create documents, but I wish to let other users create files in the document root and upload them via FTP. Should I change the permissions on the www/ folder (see the sketch below)? Or, again, should I create the document root elsewhere with different permissions? What is the safest way of doing this?

    Read the article
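
    One common arrangement, sketched with example user and group names: keep /var/www as the document root, give a dedicated group write access, and put the FTP users in that group. Apache's www-data user only ever needs read access:

        # A group for people allowed to publish to the web root:
        sudo addgroup webdev
        sudo adduser alice webdev                 # repeat per FTP/shell user

        # Root owns the tree, the group may write, and the setgid bit makes new
        # files and directories inherit the webdev group:
        sudo chown -R root:webdev /var/www
        sudo find /var/www -type d -exec chmod 2775 '{}' +
        sudo find /var/www -type f -exec chmod 0664 '{}' +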

  • Vim Misbehaving

    - by zchtodd
    I'm not sure what changed, but lately Vim has been driving me nuts. Whenever I try to do a column-mode insert, Vim combines my current character with the last character I inserted. For example, the first time I do a block comment by inserting # on multiple lines, it works fine. The next time, however, I end up with ## inserted on every line, and the problem just compounds from there. To do this, I'm hitting Ctrl-V, down or up arrow, Shift-I, #, and then Esc. This worked for months, but now it seems to be pasting extra stuff in. I've tried disabling all .vimrc files, but the behavior remains the same. Any ideas?

    Read the article

  • How to Split a Big Postscript file (3000 pages) into one individual file per page (using Windows 7)?

    - by Pablo
    Hi, I'm having trouble doing the following: I have a big PDF file that I converted to PostScript (for commercial printing). The resulting file is too big to be processed by the printer. I've been trying to find a way to either:

    1. Convert from the original (many-page) PDF file to many PostScript files (one PostScript file per page of the original PDF), or
    2. Convert from PDF to PS (or even EPS) -- I managed to do this -- and then split the PS file into a collection of smaller files.

    I've tried using Ghostscript, but it is all gibberish to me (see the sketch below). Thanks. PS: If you have a good Ghostscript tutorial (for dummies?), please share the link.

    Read the article
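
    A sketch that sidesteps splitting the huge PS file by going back to the original PDF and letting Ghostscript write one PostScript file per page. Filenames and the page count are placeholders; on Windows the command-line binary is gswin64c.exe (or gswin32c.exe) rather than gs:

        #!/bin/sh
        PDF=book.pdf
        PAGES=3000

        i=1
        while [ "$i" -le "$PAGES" ]; do
            gs -dBATCH -dNOPAUSE -q -sDEVICE=ps2write \
               -dFirstPage="$i" -dLastPage="$i" \
               -sOutputFile="page-$(printf '%04d' "$i").ps" "$PDF"
            i=$((i + 1))
        done

    The poppler utilities offer an alternative route: pdfseparate to split the PDF into single-page PDFs, then pdftops on each one.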

  • rsync not using forwarded ssh credentials

    - by Mat
    I have a situation where I would like to rsync some files from a remote server to a server in my office. The source server requires key-based authentication, and I have an appropriate key set up on my desktop machine. If I ssh into the local server and then ssh to the remote server, SSH agent forwarding works correctly. However, when I try to rsync over ssh I get permission denied. So the chain is: Desktop -- Local server -- Remote server. When ssh'd onto the local server, ssh user@remote works, but rsync -avPe ssh user@remote:/src /dest does not: Permission denied (publickey). (See the sketch below.)

    Read the article
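
    A hedged checklist sketch: rsync simply runs the ssh command given with -e, so the usual cause of this symptom is the forwarded agent not being visible in the shell that launches rsync (forwarding not requested for that session, or rsync run under sudo, which drops SSH_AUTH_SOCK). Host names are placeholders:

        # From the desktop, connect to the local server with agent forwarding:
        ssh -A user@local-server

        # On the local server, confirm the forwarded agent is visible here:
        echo "$SSH_AUTH_SOCK"
        ssh-add -l                      # should list the key the remote end expects

        # Run rsync from that same shell:
        rsync -avP -e ssh user@remote:/src /dest

        # If rsync must run under sudo, pass the agent socket through explicitly:
        sudo env SSH_AUTH_SOCK="$SSH_AUTH_SOCK" rsync -avP -e ssh user@remote:/src /dest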
