Search Results

Search found 46908 results on 1877 pages for 'managing files and folder'.


  • How to merge two .iso images

    - by pgrytdal
    I am following this tutorial to install Android onto my computer via VirtualBox. My problem is, they want you to download liveandroidv0.3.iso.001 and liveandroidv0.3.iso.002, and then merge the two files in the Terminal with:

        cat liveandroidv0.3.iso.001 liveandroidv0.3.iso.002 > liveandroidv0.3.iso

    The problem is, when I run the command, I get the following output:

        cat: liveandroidv0.3.iso.001: No such file or directory
        cat: liveandroidv0.3.iso.002: No such file or directory

    So I was wondering if there is an alternative way to merge these files? Or could you guys help me merge them this way? Extra info: OS is Ubuntu 12.10, and I downloaded the files to the downloads folder in my home directory.
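
    A likely fix (a sketch, assuming the files landed in the standard ~/Downloads directory): cat only looks in the current working directory, so change into the folder that actually holds the pieces first, or pass absolute paths.

        cd ~/Downloads
        cat liveandroidv0.3.iso.001 liveandroidv0.3.iso.002 > liveandroidv0.3.iso
        # or, without changing directory:
        cat ~/Downloads/liveandroidv0.3.iso.001 ~/Downloads/liveandroidv0.3.iso.002 > ~/Downloads/liveandroidv0.3.iso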

  • Where are Credentials stored for Network Drives on WinXP?

    - by Tom Tresansky
    I have a drive mapped to a folder on a remote machine that I connect to using the Cisco VPN client. I had stored the username/password locally, using Windows' "remember my password" feature, so I wouldn't have to enter it every time (the user/password login dialog used to appear each time I attempted to open the remote folder, and I would have to look up and enter my credentials). The password to that remote Windows account has now changed. I am no longer prompted for a user name and password; instead, upon trying to open the remote folder, I receive the message "unknown user name or bad password". How do I view and change these stored credentials?
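
    For what it's worth, on Windows XP saved entries like this live in the Stored User Names and Passwords dialog; one standard way to open it directly (a stock XP command, not specific to this setup) is:

        rundll32.exe keymgr.dll,KRShowKeyMgr

    From there the stale entry for the remote machine can be edited or removed; the same dialog is also reachable via User Accounts, then Advanced, then Manage Passwords.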

  • Convert Public Folders to a PST

    - by TrueDuality
    Alrighty, so I've got a tricky one. I currently have a public folder database (edb & stm) residing on an Exchange 2003 server. I need to export it into a PST file, or otherwise make it so that I can manually get the data in it to end-users. I can not use the export feature built into Outlook, as some of the folders refer to another server which doesn't have the data. Trying only results in the Outlook client hanging for close to an hour before giving an error about not finding the data. So this will need to be a server-side export. There are a few tools out there that seem to be available for converting edb & stm files to PSTs, but they are quite expensive. Does anybody have any ideas?

  • PLEASE HELP RECOVER MY MINT14 BOOT/GRUB [closed]

    - by C2940680
    Hi, I have the following from bootinfoscript v0.61 (1-Apr-2012). I tried several times to do a boot-repair from YannUbuntu; however, I get an error rebooting into my Linux Mint 14 Cinnamon. I have separate /boot, /, and /home partitions. Could I still use the /home partition if I recover its files onto an external USB drive, reformat the whole hard drive, repartition, and then restore /home from the USB drive I saved beforehand? Also, I tried to install Qubes 2 beta and then deleted the partition where it was stored. Also (my bad) I tried to copy the BOOT.CFG from sda6 to sda1 and sda2. All answers appreciated in advance.

        sda1: __________________________________________
        File system:       ext2
        Boot sector type:  -
        Boot sector info:
        Operating System:
        Boot files:        /grub/grub.cfg

        sda2: __________________________________________
        File system:       Extended Partition
        Boot sector type:  -
        Boot sector info:

        sda5: __________________________________________
        File system:       swap
        Boot sector type:  -
        Boot sector info:

        sda6: __________________________________________
        File system:       ext4
        Boot sector type:  -
        Boot sector info:
        Operating System:  Linux Mint 14 Nadia
        Boot files:        /boot/grub/grub.cfg /etc/fstab
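
    Before reformatting, a standard manual GRUB reinstall from a live USB may be worth trying; this is a sketch assuming, per the script output above, that / is on sda6 and the separate /boot is on sda1:

        sudo mount /dev/sda6 /mnt
        sudo mount /dev/sda1 /mnt/boot
        for d in /dev /dev/pts /proc /sys; do sudo mount --bind $d /mnt$d; done
        sudo chroot /mnt
        grub-install /dev/sda
        update-grub
        exit

    And yes, a separate /home partition can be preserved through a reinstall, or restored from the USB copy afterwards, as long as it is not reformatted.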

  • How to automatically move mail into appropriate folders - Outlook for Mac 2011

    - by user53654
    Is there any way to have emails automatically placed into the folders where the original emails reside? Situation: I sort my emails into various folders based on clients. However, when a new email comes in that is a reply to a previous one, it goes to my inbox instead of the client folder. I know the typical answer is "just create a rule for that client". The problem with that is my clients change often, and creating a new rule that often is infeasible. I was hoping there might be some kind of "follow source email" option. That way, when I move email A into folder X, all subsequent replies to email A would automatically be placed in folder X. Any ideas?

  • Dynamically load images inside jar

    - by Rahat Ahmed
    I'm using Slick2D for a game, and while it runs fine in Eclipse, I'm trying to figure out how to make it work when exported to a runnable .jar. I have it set up to load every image located in the res/ directory. Here's the code:

        /**
         * Loads all .png images located in source folders.
         * @throws SlickException
         */
        public static void init() throws SlickException {
            loadedImages = new HashMap<>();
            try {
                URI uri = new URI(ResourceLoader.getResource("res").toString());
                File[] files = new File(uri).listFiles(new FilenameFilter() {
                    @Override
                    public boolean accept(File dir, String name) {
                        if (name.endsWith(".png"))
                            return true;
                        return false;
                    }
                });
                System.out.println("Naming filenames now.");
                for (File f : files) {
                    System.out.println(f.getName());
                    FileInputStream fis = new FileInputStream(f);
                    Image image = new Image(fis, f.getName(), false);
                    loadedImages.put(f.getName(), image);
                }
            } catch (URISyntaxException | FileNotFoundException e) {
                System.err.println("UNABLE TO LOAD IMAGES FROM RES FOLDER!");
                e.printStackTrace();
            }
            font = new AngelCodeFont("res/bitmapfont.fnt", Art.get("bitmapfont.png"));
        }

    Now the obvious problem is the line

        URI uri = new URI(ResourceLoader.getResource("res").toString());

    If I pack the res folder into the .jar, there will not be a res folder on the filesystem. How can I iterate through all the images in the compiled .jar itself, or what is a better system to automatically load all images?
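
    One common workaround is to enumerate the entries of the jar the game itself is running from, then open each one with getResourceAsStream. This is a sketch, not Slick2D-specific: the class name Art and the res/ prefix are taken from the question, while the method name listPngResources is made up here.

        import java.io.File;
        import java.io.InputStream;
        import java.net.URL;
        import java.util.ArrayList;
        import java.util.Enumeration;
        import java.util.List;
        import java.util.jar.JarEntry;
        import java.util.jar.JarFile;

        // Sketch: list every res/*.png entry inside the jar that contains Art.
        public static List<String> listPngResources() throws Exception {
            List<String> names = new ArrayList<>();
            URL src = Art.class.getProtectionDomain().getCodeSource().getLocation();
            try (JarFile jar = new JarFile(new File(src.toURI()))) {
                for (Enumeration<JarEntry> e = jar.entries(); e.hasMoreElements();) {
                    String name = e.nextElement().getName();
                    if (name.startsWith("res/") && name.endsWith(".png"))
                        names.add(name);
                }
            }
            return names;
        }

        // Each name can then be opened as a stream and handed to Slick's
        // Image(InputStream, String, boolean) constructor as before:
        //   InputStream in = Art.class.getResourceAsStream("/" + name);

    When running unpacked in Eclipse, the code source is a directory rather than a jar, so a real implementation would branch on that and fall back to the existing listFiles path.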

  • PHP + IIS Application Pool Identity Windows\Temp permissions

    - by Matt Boothman
    I am currently running PHP (5.3) on IIS 7.5 on a Win2k8 R2 Web Edition server, and would like to know what problems or security vulnerabilities, if any, I might introduce into the system by assigning Read, Write, Modify & Execute permissions on %SystemRoot%\Temp to either the IUSR account or the IIS_IUSRS group. Should I be altering permissions on that folder at all (as Windows reminds me I probably shouldn't when I attempt to change them)? Should I create a temp folder somewhere else and set permissions accordingly? The problem is that when I set Anonymous Authentication (I'm guessing that is the more secure option?) to use the App Pool identity, PHP gets stuck in a loop when starting sessions, because it's unable to create session files in %SystemRoot%\Temp due to lack of permission for the application pool user or IIS_IUSRS group. Another problem is that ImageMagick (PHP extension) is being denied access to %SystemRoot%\Temp to write temporary files, so it is throwing exceptions. I have tried searching Google but have not found anything that touches on this subject specifically. Any help greatly appreciated.
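
    A common way out (a sketch under the assumption that a dedicated folder such as C:\php\temp is acceptable and that the site runs in an app pool named, say, MySite) is to point PHP at its own temp directory and grant the pool identity rights only there, leaving %SystemRoot%\Temp untouched:

        ; php.ini: give PHP its own temp folder instead of %SystemRoot%\Temp
        session.save_path = "C:\php\temp"
        upload_tmp_dir    = "C:\php\temp"

        :: cmd (elevated): let the app-pool identity write there
        icacls "C:\php\temp" /grant "IIS AppPool\MySite":(OI)(CI)M

    ImageMagick's temporary files can usually be redirected to the same folder via its temporary-path setting, which avoids loosening the system Temp ACLs at all.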

  • File permission mask/mode settings for Samba on FreeNAS?

    - by tkahn
    I'm currently working on the Samba settings on a FreeNAS server. When any user creates a file or a folder on the server, I want the file or folder to get the following permissions:

        Folders: drwxrws---
        Files:   -rwxrws---

    To set the permissions like this manually I use chmod 2770, which works great. But I want this to happen automatically, and therefore I've added the following lines to smb.conf:

        create mask = 2770
        directory mask = 2770
        force create mode = 2770
        force directory mode = 2770

    But when I test by creating a file in one of the folders, it gets these permissions:

        Folder: drwxrwx
        File:   -rwxrw----

    What am I overlooking or doing wrong? Is the order of the lines relevant? Does the setgid digit (the 2 in 2770) mess things up?
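
    For reference, the smb.conf man page describes the combination as a bitwise AND with the mask followed by an OR with the force mode; a worked example of what that should give (the client-requested mode 0644 is just an illustrative assumption):

        # final_mode = (requested_mode AND create_mask) OR force_create_mode
        #   requested:             0644
        #   AND create mask 2770:  0640
        #   OR  force mode  2770:  2770   ->  -rwxrws---

    Since the observed result differs from that, something else may be discarding the lines, for example FreeNAS regenerating smb.conf over hand edits, or per-share settings overriding the global section; it is worth checking which section of the generated smb.conf the lines actually end up in.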

  • Way to make video-thumbnails generate from VLC instead of Totem?

    - by nick
    I'm suffering from a problem where video thumbnails do not appear in Nautilus for some video files. I just found the bug "typefinding: some mpeg files are not identified as mpeg files", which seems to address the problem. I don't understand the specifics as reported in the bug report, but it sounds like a problem with Totem's interaction with GStreamer. Since all my videos play fine with VLC (and they don't all play with Totem), I don't use Totem very much. Is there a way to make VLC generate the video thumbnails instead of having to rely on the buggy GStreamer/Totem? I made VLC my default video player, but this had no effect on the display of video thumbnails. If Totem can't play a video file, then I get no thumbnail. But VLC can play the videos fine, so why can't VLC create a video thumbnail for it?
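
    Not VLC itself, but one workaround on GNOME-based desktops is to hand thumbnailing to ffmpegthumbnailer, which uses FFmpeg rather than GStreamer. A sketch; the .thumbnailer mechanism and paths below apply to newer GNOME releases, so treat them as an assumption to verify for your version:

        sudo apt-get install ffmpegthumbnailer

        # /usr/share/thumbnailers/ffmpeg.thumbnailer
        [Thumbnailer Entry]
        TryExec=ffmpegthumbnailer
        Exec=ffmpegthumbnailer -i %i -o %o -s %s
        MimeType=video/mpeg;video/mp4;video/x-matroska;video/x-msvideo;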

  • Where are Wireless Profiles stored in Ubuntu

    - by LonnieBest
    Where does Ubuntu store the profiles that allow it to remember the credentials of private wireless networks it has previously authenticated to and used? I just replaced my uncle's hard drive with a new one and installed Ubuntu 10.04 on it (he had Ubuntu 9.10 on his old hard drive). He is at my house right now, and I want him to be able to access his private wireless network when he gets home. Usually, when I upgrade Ubuntu, I have his /home directory on another partition, so his wireless profile for his own network persists. Right now, however, I'm trying to figure out which hidden dot-folder I need to copy from his /home/user folder on the old hard drive to the new hard drive, so that he will have wireless Internet when he gets home. Does anyone know with certainty exactly which folder I need to copy to the new hard drive to achieve this?
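
    From what I recall of 9.10/10.04-era NetworkManager (an assumption to verify against the old drive rather than a certainty), per-user wireless profiles were kept in GConf and the actual keys in the GNOME keyring, so two dot-folders are involved; system-wide connections live outside /home entirely:

        ~/.gconf/system/networking/connections/   # connection definitions
        ~/.gnome2/keyrings/                       # the stored WPA/WEP secrets
        /etc/NetworkManager/system-connections/   # system-wide, not in /home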

  • Localhost permissions given different values in fireftp and cs4 dreamweaver

    - by YsoL8
    While testing a file uploader on my localhost (MAMP on Mac) I've hit a problem. Trying to fix a folder permissions problem, I used CS4 Dreamweaver's permissions screen to set 0777 permissions. However, these wouldn't apply and stayed stuck at 0, so I opened FireFTP and accessed the folder in the local panel. The permissions there are 0777. So I have a folder that has permissions of 0 and 0777 at the same time. How can I resolve this and make sure the permissions really are 0777?
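
    When two GUI tools disagree, the terminal is a useful tiebreaker; a quick check and fix (the path below is a placeholder for the real folder):

        ls -lde /path/to/folder      # -e also lists any ACLs on OS X
        chmod -R 777 /path/to/folder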

  • SharePoint Designer not syncing consistently

    - by normalocity
    I've got a user who uses SharePoint Designer to maintain an internal intranet site. When syncing (remote-to-local) it appears to work at first, but usually hangs about 2-3 minutes into the sync, when he's syncing to a sub-folder of his "My Documents". In this case, his "My Documents" is stored on a network share/profile. When I do the same thing, it works for me. The difference? My "My Documents" folder is stored locally. In other words, he's syncing from the remote server into a network share; I'm syncing from the remote server onto a local drive. Any idea why having the sync destination on a network share, vs. a local drive, would cause this? When it locks up, we can still navigate to his "My Documents" folder, so I don't believe we're losing the connection to his drive, unless perhaps the connection is intermittent and SharePoint Designer isn't retrying the sync.

  • Chrome drag and drop download links

    - by Brad
    In Chrome, I used to be able to take a link to a file and just drag it to a folder on my system. Chrome would then download whatever resource was at the link's URL and put it into the folder it was dropped into. This was particularly handy when using Gmail: if there was an attachment, I could just click and drag it into a folder, and Chrome would download it for me to the correct place. Now I have to hit download, and then drag from the download bar when it is finished. Has this feature been removed? Is there any way to bring it back?

  • How do I open a file with a program via a shortcut from the cmd prompt

    - by PassByReference0
    Here's my predicament. When I add a program's location to my PATH, I can do the following in the cmd prompt to open a file in my current directory:

        notepad++ open_me.txt

    And this opens open_me.txt in Notepad++. However, I don't want to have to add every single program I want to run to my PATH. What I want is to add a single folder called C:\Users\Me\Documents\Programs to my PATH, drop shortcuts to various programs into that folder, and have them work the same as if the programs themselves were on my PATH. So I dropped a shortcut to notepad++.exe named "np" into my folder, and what I got was this: I have to run it with start np (instead of just np); but more importantly, if I try start np open_me.txt, it opens notepad++.exe but looks for open_me.txt in Notepad++'s directory. How can I do this properly? (Also, I'd like to be opening notepad++.exe with the shortened name np.)
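
    One approach that sidesteps .lnk shortcut resolution entirely (a sketch; the Notepad++ install path below is an assumption) is to drop a tiny np.cmd wrapper into that PATH folder instead of a shortcut. Batch files on the PATH run by bare name, and the file argument is resolved against the current directory:

        :: C:\Users\Me\Documents\Programs\np.cmd
        @echo off
        start "" "C:\Program Files\Notepad++\notepad++.exe" %*

    With that in place, running np open_me.txt from any directory opens the file next to you rather than one in Notepad++'s own folder.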

  • Game Patching Mac/PC

    - by Centurion Games
    Just wondering what types of solutions are available to handle patching of PC/Mac games that don't have any sort of auto-updater built into them. On Windows, do you just spin off some sort of new InstallShield package for the game that includes the updated files, hope you can read a valid registry key to point to the right directory, and overwrite files? If so, how does that translate to the Mac, where the game is normally distributed as a straight-up .app bundle? Is there a better approach than the above for an already-released product? (Assuming direct sales, not through a marketplace that features auto-updating, like Steam.) Are there any off-the-shelf auto-updater libraries that could easily be integrated with a C/C++ code base even after a game has shipped, to make this a lot simpler, and that are cross-platform? Also, how do auto-updaters work with new OSes that want applications and files digitally signed?

  • Removing write permission on home and public_html on Centos/Cpanel

    - by user5858
    I'm running sites on two cPanel accounts on my VPS in WHM, using the DSO PHP handler and the Apache web server. After recent intrusion attempts I've chowned $HOME and the public_html folder to root with permission 555. I'm on a VPS with cPanel on CentOS, running CMS software like Joomla, Drupal, MyBB, etc. Will this cause any problem for my VPS installation or server-side processes? Drupal, Joomla, MyBB etc. will not be affected by this; some files will simply not be created, like error_log. At least hackers will not be able to place any malicious code within the home folder or the public_html folder.

  • What does /dev/null mean in the shell?

    - by rishiag
    I've started learning bash scripting by using this guide: http://www.tldp.org/LDP/abs/abs-guide.pdf However, I got stuck at the first script:

        cd /var/log
        cat /dev/null > messages
        cat /dev/null > wtmp
        echo "Log files cleaned up."

    What do lines 2 and 3 do in Ubuntu (I understand cat)? Is it only for other Linux distributions? After running this script as root, the output I get is "Log files cleaned up." But /var/log still contains all the files.
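
    For context, redirecting from /dev/null is just one idiom for truncating a file to zero length; the files still exist afterwards (which is why /var/log still lists them), they are merely empty until syslog writes to them again. Equivalent forms:

        cat /dev/null > messages   # what the guide uses
        > messages                 # bare redirection does the same in bash
        truncate -s 0 messages     # coreutils equivalent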

  • Prevent users from creating / copying / moving anything except .exe

    - by webnoob
    We have a program that compiles executables into the folder C:\bin. Ideally I would like to share this folder so users can access the .exe files within, but stop them creating any other files in there. The reason for this is to stop users grabbing source code, putting it in a shared drive, and taking it. We have a domain controller set up, and all the users belong to a specific security group. Is there any way to achieve this? EDIT: To clarify, I need to stop users from creating or moving files INTO the C:\bin folder which are not executables.
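
    Two building blocks to consider (a sketch; the group name is a placeholder). Plain NTFS permissions can stop the group writing anything at all, which covers the "no copying source in" case; filtering by extension, on the other hand, is what File Server Resource Manager's file screens are for, since NTFS ACLs cannot distinguish .exe from other file types:

        :: read, traverse and execute only; no create/write for the group
        icacls C:\bin /grant "DOMAIN\AppUsers":(OI)(CI)RX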

  • Using pscp and getting permission denied

    - by Espen
    I'm using pscp to transfer files to a virtual Ubuntu server using this command:

        pscp test.php user@server:/var/www/test.php

    and I get the error "permission denied". If I transfer to the folder /home/user/ I have no problems, so I guess this has to do with the fact that the user I'm using doesn't have write access to /var/www/. When I use SSH I have to use sudo to get access to the /var/www/ path - and I do. Is it possible to make pscp "sudo" transfers to the server, so I can get access to the /var/www/ path and actually transfer files to this folder?
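
    pscp itself has no sudo step, so the usual workaround is a two-hop copy: upload somewhere writable, then move the file with sudo over an SSH command. A sketch using PuTTY's plink (-t requests a terminal so sudo can prompt for the password; with NOPASSWD configured it runs unattended):

        pscp test.php user@server:/tmp/test.php
        plink -t user@server "sudo mv /tmp/test.php /var/www/test.php"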

  • Why does Apache ignore my Directory block?

    - by Codemonkey
    I just moved my projects to a new workstation, and I'm having trouble getting my Apache installation to acknowledge my .htaccess files. This is my /etc/apache2/conf.d/dev config file:

        <Directory /home/codemonkey/dev/myproject/>
            Options -Indexes
            AllowOverride All
            Order Allow,Deny
            Deny from all
        </Directory>

    I know the config file is being included by Apache, because it complains if I put erroneous syntax in it (Action 'configtest' fails). My project is reachable through Apache by a symlink in the /var/www directory. The server is running with my user and group, so it has my permissions, and my entire dev folder has permissions set to 770 recursively. Despite all this, I'm still getting an indexed listing of my project folder when I visit http://localhost/myproject. Why isn't the above config making it impossible to view the folder in the browser?
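
    One plausible explanation (an assumption worth checking, though the Apache docs warn about exactly this case): <Directory> sections match the filesystem path Apache builds from DocumentRoot, and symlinks are not resolved during that match. A request served via /var/www/myproject therefore never hits a block written for /home/codemonkey/dev/myproject. A block on the path as served should then behave as expected:

        <Directory /var/www/myproject>
            Options -Indexes
            AllowOverride All
            Order Allow,Deny
            Deny from all
        </Directory>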

  • Commands don't have permission when using an absolute path

    - by Markos
    I have the folder /srv/samba/video set up this way:

        $ getfacl /srv/samba/video
        # file: srv/samba/video
        # owner: root
        # group: nogroup
        user::rwx
        group::---
        group:sambaclients:rwx
        group:deluge:rwx
        mask::rwx
        other::---
        default:user::rwx
        default:group::---
        default:group:sambaclients:rwx
        default:group:deluge:rwx
        default:mask::rwx
        default:other::---

    That means user deluge has rwx on the folder /srv/samba/video. However, when running commands as user deluge, I am getting weird permission errors. When in the folder /srv/samba/video:

        sudo -u deluge mkdir foo

    works flawlessly. But when using an absolute path:

        sudo -u deluge mkdir /srv/samba/video/foo

    I am getting permission denied. Running sudo -u deluge id gives the output

        uid=113(deluge) gid=124(deluge) groups=124(deluge)

    which shows that user deluge is indeed in group deluge. The behavior was also the same when I gave the permissions to user deluge, not just group deluge. When executing as a non-system user, it does work. The reason I want to use absolute paths is that I am using an automatically triggered post-download script which extracts some files into the folder. I have spent way too many hours trying to solve this problem myself. mkdir isn't the only command that fails; touch does the same thing, so I suspect it's not mkdir's fault. If you need more info, I will try to put it in here, just ask. Thanx in advance.
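
    One thing worth ruling out (an educated guess, not stated in the question): resolving an absolute path requires execute (x) permission on every directory component, here /, /srv and /srv/samba, for the user doing the lookup. A process whose working directory is already inside the tree skips that traversal, which would explain exactly this relative-vs-absolute difference:

        # show the permissions of every component along the path
        namei -l /srv/samba/video
        # if /srv or /srv/samba lacks x for deluge, grant traversal, e.g.:
        sudo setfacl -m g:deluge:x /srv /srv/samba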

  • Windows 7 Virtual PC - “RPC server unavailable”

    - by Kelly Jones
    I use Windows 7 Virtual PC on my current project, and I often bring home the files so I can work some in the evenings. Since my VHDs are large, I'll only copy the undo disks, saved state, and virtual machine config files from my external drive. I copy them to a small portable drive, and once I get home, I copy them to a large external drive. I've done this for over a year, but recently I started getting an error when I tried to start the VPC after the copying was finished. It would open the initial window with the progress bar, but eventually the bar would stop, turn red, and the error "RPC server unavailable" would appear. When I first started seeing these, I'd try again, but no luck. After some testing, it turns out that my small portable drive is apparently going bad, and it was corrupting the files. Lucky for me, I never overwrote my good copies with corrupted copies, at least not at both the office and at home.

  • Panic Transmit file upload

    - by 1ndivisible
    I've ditched Coda and bought Transmit, but I'm a little confused by the file uploading. I have exactly the same folder structure remotely and locally, but if I right-click a file and choose Upload "SomeFileName.html", the file is always uploaded into the root of the remote site, even if the file is in a folder. If I choose to upload a file at assets/images/some_image.png, I would expect it to be uploaded to the same folder on the remote server, not the root. Coda dealt with this perfectly and also told me which files had been modified and needed uploading; Transmit doesn't seem to do either of these things. So my questions are: How can I upload a file to the same path on the remote server without having to drag and drop? Is there any way to have Transmit mark edited files, or upload only edited files? [There is no tag for Transmit, so if someone with more rep could make and add one, that would be grand.]

  • Jumpshare Makes It Dead Simple To Drag, Drop, and Share 150+ File Formats

    - by Jason Fitzpatrick
    If you're looking for a super simple way to share files with friends and coworkers, Jumpshare offers drag-and-drop file transfer with a powerful built-in file viewer. You don't need to register, install any software, or do anything but drag the file, drop it onto the Jumpshare interface, and share the link with your friend. Share the link and your friend can watch the real-time progress of the file upload, as well as download or view the completed files within the Jumpshare file viewer. Files are hosted for two weeks before deletion. Hit up the link below to take it for a test drive. Jumpshare

  • At what point is asynchronous reading of disk I/O more efficient than synchronous?

    - by blesh
    Assuming there is some bit of code that reads files for multiple consumers, and the files are of arbitrary size: at what size does it become more efficient to read a file asynchronously? Or, to put it another way, how small must a file be for it to be faster to just read it synchronously? I've noticed (and perhaps I'm incorrect) that when reading very small files, it takes longer to read them asynchronously than synchronously (in particular with .NET). I'm assuming this has to do with setup time for things like I/O completion ports, threads, etc. Is there any rule of thumb to help out here? Or is it dependent on the system and the environment?
