Search Results

Search found 89530 results on 3582 pages for 'file read'.

Page 234/3582 | < Previous Page | 230 231 232 233 234 235 236 237 238 239 240 241  | Next Page >

  • How to get the exit code of a script started in a screen session

    - by Bettina
    Hi folks, I am currently creating a backup script which uses screen to start a backup job with rsync inside a screen session. The backup jobs are started as follows: screen -dmS backup /usr/bin/rsync ... As soon as the rsync job is finished, the screen session is terminated automatically. To make sure that the backup was successful, I would like to check the exit code of the rsync job, but unfortunately I really don't know how to get the exit code after the screen session has terminated. Does someone have a good idea how to automatically check whether the rsync job was successful or not? I already thought about using a temp file, like this: screen -dmS myScreen "rsync -av ... ; echo $? > /tmp/myExitCode" but this unfortunately does not work. Then I thought about using stderr like in the example below: screen -dmS myScreen "rsync -av ... 2> /tmp/rsync-stderr" Neither of my ideas worked out so far, since stderr is not written when I use the command above. :-( Would be great if someone has a good idea or even a solution. Cheers, Bettina
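
    The temp-file idea can work if a shell inside the screen session performs the redirection, so that $? and > are evaluated there rather than by the outer shell. A minimal sketch (the session name, rsync arguments, paths and polling interval are placeholders, not a tested solution):

        #!/bin/bash
        # Start rsync in a detached screen session and write its exit code to a file.
        # Single quotes keep $? from being expanded before the job has run.
        screen -dmS backup bash -c 'rsync -av /source/ /backup/ > /tmp/rsync.log 2>&1; echo $? > /tmp/rsync-exit-code'

        # Wait until the screen session has gone away, then read the exit code.
        while screen -list | grep -q "\.backup"; do
            sleep 10
        done

        if [ "$(cat /tmp/rsync-exit-code)" -eq 0 ]; then
            echo "backup succeeded"
        else
            echo "backup failed with exit code $(cat /tmp/rsync-exit-code)"
        fi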

    Read the article

  • How to perform this Windows 7 permissions change on many files via GUI or command line

    - by hippietrail
    After using my external hard drive on another Windows 7 computer to tweak photos with Windows Live Photo Gallery then upload them to Facebook I found the modified images were now not visible on the original Windows 7 computer. I'm not sure if the things I tried to get it working subsequently changed anything, but I do know this is the sequence of actions that makes the permissions of the modified files match those of the unmodified files: Right click on broken image file, select "Properties" On the "Security" tab press the "Advanced" button In the "Permissions" tab press the "Continue" button with the shield icon on it Tick the box marked "Include inheritable permissions from this object's parent Click the "Remove" button to remove the only current entry "Type: Allow, Name: Administrators (XYZ\Administrators), Permission: Full control, Inherited From: OK on the "Permissions" tab. OK on the "Security" tab. Now this same procedure does not work at the folder level. It results in "access denied" dialogs. I'm looking for some way to perform this exact modification on all the images I edited on the other computer. I'm happy to use the Windows GUI in Explorer or any other included tools. I'm happy to use the Windows command line. I'd prefer not to use a third-party tool since I'd have to be satisfied it's not doing anything else. I'm not looking for a different way to change permissions to other settings to make an external drive full of photos editable on multiple computers. At least not in this question.

    Read the article

  • Determining physical location of data on a disc

    - by Synetech
    Does anybody know of a way to find out where, physically, on a CD or DVD a given piece of data would be located? I am trying to watch a DVD at the moment, and am about half-way through, but it keeps dying at a specific spot in the film, presumably because of a scratch. I have a repair kit, but I don’t know where to focus my repair because there are several scuffs and scratches on the disc and I have no way of knowing which one is causing the issue. Obviously, cleaning all of them is inadvisable because not only does it waste the consumable materials in the kit, but not all of them are a problem, and by working them, some may become unreadable. Moreover, just because I am half-way through the movie does not mean that it would be half-way from the hub to the edge, for several reasons: Discs have more data towards the outer edge than the inner edge (circles are more mathematically complicated than rectangles) The disc is not completely filled up (and even if it were, the movie itself would not be using it all, there are extras and such) Because in this particular case it is a commercial DVD, it is also dual-layer, which further complicates manual determination As such, I am trying to find a program that can let me identify a file (or part thereof), cluster, etc. and show me a picture of where on the CD/DVD it would be located. That way, I can look at the disc and fix any scratches that correspond to that distance from the hub. For example, the image below might indicate where on a disc a couple of files or range of clusters would be located, so by looking for anomalies in those areas (rotating as necessary), the correct one can be identified. I’m sure it can be done since at least one form of copy protection (DPM) uses it and DVD-lab Pro includes a “DVD Topology” feature to do this.

    Read the article

  • How do I fully share a Hard Drive on my Local Network?

    - by GingerLee
    I have 4 computers connected to a router (DD-WRT). My main PC is Windows 7 (Home Premium). This machine has 2 hard disks: HD1 is used for my OS and the other (HD2) is used to store files. My 3 other machines are 1. an Ubuntu desktop that I use to learn about Linux, 2. a Mac OS X laptop, and 3. a netbook running Windows 7. How do I easily share HD2 with my other machines? I would like all my machines to have full access & permissions to HD2, however I would like to RESTRICT access to only PCs that are connected to my router (either via LAN or WiFi) --- btw, I know this is not very secure due to WiFi vulnerability, however, I currently MAC-address-restrict WiFi connections on my router. Extra info: I have already tried to use the Windows folder sharing feature: i.e. I right click over the icon of HD2 and click on the Sharing tab, but in the sub-window labeled "Network File and Folder Sharing", the "Share" button is grayed out. I can click on "Advanced Sharing" but that just takes me to a screen in which I have to set certain permissions. What is not clear to me is: how do I set a criterion that shares HD2 with all computers connected to my router?

    Read the article

  • My Hard Drive isn't Working

    - by MeCB
    I never use Safely Remove Hardware with Windows XP. It has worked for me for years with my SD card, mouse, hard drive and memory stick. My hard drive has a USB cable and power cord so I can hook it up to any desktop computer. Now my hard drives don't like this, and I did not know this till now. I am always careful to wait till it has finished accessing the USB before I unplug it. Now three of my hard drives can't be seen by Windows, though the others still work. When I hook a drive up to another computer it works fine. I use the same USB cable to hook up all of my hard drives one at a time, so my USB cable is good. I think that when I unplugged the hard drive this one time, it had a file it still wanted to see, and now this drive does not work only on this computer. Then the same thing happened to my other two hard drives after I used them for a week with the same cables. How can I fix this?

    Read the article

  • Adding a hyperlink in a client report definition file (RDLC)

    - by rajbk
    This post shows you how to add a hyperlink to your RDLC report. In a previous post, I showed you how to create an RDLC report. We have been given a requirement for the report we created earlier, the Northwind Product report: add a column that will contain hyperlinks which are unique per row. The URLs will be RESTful with the ProductID at the end. Clicking on the URL will take them to a website like so: http://localhost/products/3 where 3 is the primary key of the product row clicked on. To start off, open the RDLC and add a new column to the product table. Add text to the header (Details) and row (Product Website). Right click on the row (not header) and select “TextBox properties”. Select Action – Go to URL. You could hard-code a URL here, but what we need is a URL that changes based on the ProductID. Click on the expression button (fx). The expression builder gives you access to several functions and constants including the fields in your dataset. See this reference for more details: Common Expressions for ReportViewer Reports. Add the following expression: = "http://localhost/products/" & Fields!ProductID.Value Click OK to exit the Expression Builder. The report will not render because hyperlinks are disabled by default in the ReportViewer control. To enable them, add the following in your page load event (where rvProducts is the ID of your ReportViewer control): protected void Page_Load(object sender, EventArgs e) { if (!IsPostBack) { rvProducts.LocalReport.EnableHyperlinks = true; } } We want our links to open in a new window, so set the HyperLinkTarget property of the ReportViewer control to “_blank”. We are done adding hyperlinks to our report. Clicking on the links for each product pops open a new window. The URL has the ProductID added at the end. Enjoy!

    Read the article

  • Using XNA ContentPipeline to export a file in a machine without full XNA GS

    - by krolth
    My game uses the Content Pipeline to load the spriteSheet at runtime. The artist for the game sends me the modified spritesheet and I do a build in my machine and send him an updated project. So I'm looking for a way to generate the xnb files in his machine (this is the output of the content pipeline) without him having to install the full XNA Game studio. 1) I don't want my artist to install VS + Xna (I know there is a free version of VS but this won't scale once we add more people to the team). 2) I'm not interested in running this editor/tool in Xbox so a Windows only solution works. 3) I'm aware of MSBuild options but they require full XNA I researched Shawn's blog and found the option of using Msbuild Sample or a new option in XNA 4.0 that looked promising here but seems like it has the same restriction: Need to install full XNA GS because the ContentPipeline is not part of the XNA redist. So has anyone found a workaround for this?

    Read the article

  • How to reload sarg configuration file

    - by black sensei
    I installed sarg a while ago on Ubuntu 12.04 and had forgotten that I wanted it to generate reports inside /var/www/vhosts/reports.lan/htdocs. I found this out this morning and made the change in /etc/sarg/sarg.conf. I then manually ran sarg with # sarg-reports today, but it generated the report in the old folder /var/lib/sarg. I know I could create a symbolic link, but I was surprised that I could not find a single command to reload or at least restart sarg. Can anyone give me the command to restart or reload it? Thanks
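
    For what it is worth, sarg is not a daemon, so there is nothing to reload or restart: the configuration file is read each time the command runs. A rough sketch of how to force the new location and to check why the report still lands in /var/lib/sarg (the -o flag and the wrapper path are assumptions to verify against the installed version):

        # Force the output directory on the command line for a one-off run
        sudo sarg -o /var/www/vhosts/reports.lan/htdocs

        # The sarg-reports wrapper script may hard-code its own output location,
        # overriding sarg.conf; check what it actually uses
        grep -n "output\|/var/lib/sarg" "$(which sarg-reports)"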

    Read the article

  • Umbraco Permissions Script - Secure Version

    - by Vizioz Limited
    Back in May I blogged about how to set Permissions for Umbraco using SetACL to set the appropriate directory permissions based on the installation recommendations. Recently I have been working on a site for a client who wanted every security item to be locked down as tightly as possible, and so I modified the script based on the Umbraco security best practices. I thought I'd share it with everyone; if I have missed anything, or if anyone has any suggestions on how to improve this, please let me know :) Please refer to my previous post regarding the SetACL command line application that you will need. I suggest you save the following into a batch file called umbPermSecure.bat:

        echo off
        REM Script to setup the Security Permissions for an Umbraco site
        REM This script will give your machine Network Service the minimum rights required
        REM for Umbraco to work
        REM I suggest you update this script to also remove any users who do not need
        REM access to the web folders
        REM **** Pre-requisites ****
        REM You will need to download - http://setacl.sourceforge.net/
        REM It is assumed that you have stored SetACL in a directory called, C:\SetACL if
        REM not, you will need to modify the script.
        REM **** Usage ****
        REM You need to pass in the path for the root of your Umbraco directory
        REM E.g. umbPermSecure.bat C:\inetpub\umbracoroot
        @echo umbPermSecure.bat - Script to set Umbraco File and Directory Permissions
        @echo based on the Umbraco Security Best Practices Document (13th March 2009)
        @echo Published by Chris Houston - 19th October 2009
        @echo http://blog.vizioz.com
        @echo Adding READ only access
        SetACL.exe -on "%1" -ot file -actn ace -ace "n:%computername%\NETWORK SERVICE;p:read" -actn clear -clr "dacl,sacl" -log "c:\setacl\log.txt"
        SetACL.exe -on "%1\web.config" -ot file -actn ace -ace "n:%computername%\NETWORK SERVICE;p:read" -actn clear -clr "dacl,sacl" -log "c:\setacl\log.txt"
        SetACL.exe -on "%1\bin" -ot file -actn ace -ace "n:%computername%\NETWORK SERVICE;p:read" -actn clear -clr "dacl,sacl" -log "c:\setacl\log.txt"
        SetACL.exe -on "%1\umbraco" -ot file -actn ace -ace "n:%computername%\NETWORK SERVICE;p:read" -actn clear -clr "dacl,sacl" -log "c:\setacl\log.txt"
        @echo Adding READ and EXECUTE access
        SetACL.exe -on "%1\app_code" -ot file -actn ace -ace "n:%computername%\NETWORK SERVICE;p:read_ex" -actn clear -clr "dacl,sacl" -log "c:\setacl\log.txt"
        SetACL.exe -on "%1\usercontrols" -ot file -actn ace -ace "n:%computername%\NETWORK SERVICE;p:read_ex" -actn clear -clr "dacl,sacl" -log "c:\setacl\log.txt"
        @echo Adding READ, WRITE and MODIFY access
        SetACL.exe -on "%1\config" -ot file -actn ace -ace "n:%computername%\NETWORK SERVICE;p:read" -ace "n:%computername%\NETWORK SERVICE;p:change" -actn clear -clr "dacl,sacl" -log "c:\setacl\log.txt"
        SetACL.exe -on "%1\css" -ot file -actn ace -ace "n:%computername%\NETWORK SERVICE;p:read" -ace "n:%computername%\NETWORK SERVICE;p:change" -actn clear -clr "dacl,sacl" -log "c:\setacl\log.txt"
        SetACL.exe -on "%1\data" -ot file -actn ace -ace "n:%computername%\NETWORK SERVICE;p:read" -ace "n:%computername%\NETWORK SERVICE;p:change" -actn clear -clr "dacl,sacl" -log "c:\setacl\log.txt"
        SetACL.exe -on "%1\masterpages" -ot file -actn ace -ace "n:%computername%\NETWORK SERVICE;p:read" -ace "n:%computername%\NETWORK SERVICE;p:change" -actn clear -clr "dacl,sacl" -log "c:\setacl\log.txt"
        SetACL.exe -on "%1\media" -ot file -actn ace -ace "n:%computername%\NETWORK SERVICE;p:read" -ace "n:%computername%\NETWORK SERVICE;p:change" -actn clear -clr "dacl,sacl" -log "c:\setacl\log.txt"
        SetACL.exe -on "%1\python" -ot file -actn ace -ace "n:%computername%\NETWORK SERVICE;p:read" -ace "n:%computername%\NETWORK SERVICE;p:change" -actn clear -clr "dacl,sacl" -log "c:\setacl\log.txt"
        SetACL.exe -on "%1\scripts" -ot file -actn ace -ace "n:%computername%\NETWORK SERVICE;p:read" -ace "n:%computername%\NETWORK SERVICE;p:change" -actn clear -clr "dacl,sacl" -log "c:\setacl\log.txt"
        SetACL.exe -on "%1\xslt" -ot file -actn ace -ace "n:%computername%\NETWORK SERVICE;p:read" -ace "n:%computername%\NETWORK SERVICE;p:change" -actn clear -clr "dacl,sacl" -log "c:\setacl\log.txt"

    Read the article

  • 550 “Overwrite permission denied” when editing a file via FTP

    - by nodebunny
    DreamHost recently moved my accounts to a new shared box, and now I can't edit files via UltraEdit's built in FTP client, which messes up my work flow! What did they do that this is not working now? It stopped working after they moved me. Here's the output from the FTP console in UltraEdit 10/26/2011 10:42:36 AM: 220 DreamHost FTP Server 10/26/2011 10:42:36 AM: USER nodebunny 10/26/2011 10:42:36 AM: 331 Password required for ninjawww 10/26/2011 10:42:36 AM: PASS xxxxxxxx 10/26/2011 10:42:36 AM: 230 User nodebunny logged in 10/26/2011 10:42:36 AM: FEAT 10/26/2011 10:42:36 AM: 211-Features: LANG ja-JP.UTF-8;ja-JP;zh-TW;fr-FR;zh-CN;en-US*;bg-BG;ko-KR.UTF-8;ko-KR MDTM MFMT TVFS UTF8 MFF modify;UNIX.group;UNIX.mode; MLST modify*;perm*;size*;type*;unique*;UNIX.group*;UNIX.mode*;UNIX.owner*; REST STREAM SIZE 211 End 10/26/2011 10:42:36 AM: OPTS UTF8 ON 10/26/2011 10:42:36 AM: 200 UTF8 set to on 10/26/2011 10:42:36 AM: PWD 10/26/2011 10:42:36 AM: 257 "/" is the current directory 10/26/2011 10:42:36 AM: PWD 10/26/2011 10:42:36 AM: 257 "/" is the current directory 10/26/2011 10:42:36 AM: CWD /dev/proj/nodebunny 10/26/2011 10:42:36 AM: 250 CWD command successful 10/26/2011 10:42:36 AM: PWD 10/26/2011 10:42:36 AM: 257 "/dev/proj/nodebunny/lib/Buffer" is the current directory 10/26/2011 10:42:36 AM: PWD 10/26/2011 10:42:37 AM: 257 "/dev/proj/nodebunny/lib/Buffer" is the current directory 10/26/2011 10:42:37 AM: TYPE I 10/26/2011 10:42:37 AM: 200 Type set to I 10/26/2011 10:42:37 AM: PORT 10,15,55,125,226,16 10/26/2011 10:42:37 AM: 200 PORT command successful 10/26/2011 10:42:37 AM: STOR Buffer.pm 10/26/2011 10:42:37 AM: 550 Buffer.pm: Overwrite permission denied

    Read the article

  • NTFS partitions hidden under EXT4 file system / partition...want to recover files from NTFS

    - by user7534
    I am new to Ubuntu, but very impressed with the system. One day I tried installing Ubuntu 10.10 alongside Windows in dual boot. The first time it did not install properly; during the second attempt I got it right, but oh... I lost my Windows 7. Here is my problem and what I have done till now: 1. I have the HDD installed with Ubuntu; the same disk holds my Windows partitions, and I need to extract data from those... very, very important. 2. I tried to access them from Ubuntu... cannot access them. 3. Reinstalled Windows 7; the HDD is not detected. 4. During that installation Ubuntu was gone, so I reinstalled it. A scan in Ubuntu says the HDD is fine, and DiskInternals Linux Reader actually shows the NTFS partitions, but the recovery tool is not able to get any data out. Please help, I need the data from these partitions... please. I feel that I have put an ext4 partition on top of the NTFS filesystem and am now not able to access it.
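
    A rough first diagnostic sketch, assuming the drive shows up as /dev/sda (the device name has to be confirmed first, and TestDisk should only be allowed to write a new partition table once its findings look right):

        # See what partitions and filesystem signatures are still on the disk
        sudo fdisk -l
        sudo blkid

        # TestDisk can search for lost NTFS partitions and rebuild the partition table
        sudo apt-get install testdisk
        sudo testdisk /dev/sda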

    Read the article

  • 30 seconds from File|New to a new CRUD Silverlight application with Telerik's new LINQ Implementation

    Last month Telerik released its new LINQ implementation and last week we released the new Data Services Wizard for Telerik OpenAccess, which supports both traditional OpenAccess entities and the new LINQ implementation. I will walk you through the process where you can connect to a database, add a new domain model, wrap it in a new WCF Data Services (Astoria) service, and add a CRUD-enabled Silverlight application. All in 30 seconds! Step 1: Build your Domain Model (20 seconds) Open Visual Studio 2010 RTM (or 2008) and add a new ASP.NET project. Right click on the project and select Add|New Item and choose Telerik OpenAccess Domain Model from the item template list. The Visual Entity Designer wizard comes up. Select the database server you are using in the first screen (SQL Server, Oracle, SQL Azure, MySQL, etc.) and then also build your database connection string. Next select the tables, views, and stored procedures you want ...

    Read the article

  • get the path of moved directory/file (source path and destination path)

    - by Ghanshyam Rathod
    When a directory is moved to another destination path, I want to log both the original source path and the path it was moved to. Is any command already available, or is there any other way of doing this task? Ex: D1 = /home/user/Documents/test D2 = /home/user/Documents/Data/test When I move the D1 directory to the D2 destination path, I want a log entry like: the "test" directory was moved from the D1 path to the D2 path. Thanks
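
    A minimal sketch with inotify-tools (the watch root and log file are placeholders; MOVED_FROM and MOVED_TO arrive as separate events, so pairing a source line with its destination line is still left to whatever reads the log):

        sudo apt-get install inotify-tools

        # Log every move event: MOVED_FROM lines carry the source path,
        # MOVED_TO lines carry the destination path.
        inotifywait -m -r -e moved_from -e moved_to \
            --timefmt '%F %T' --format '%T %e %w%f' \
            /home/user/Documents >> /tmp/dir-moves.log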

    Read the article

  • Ubuntu One file sync error: SSL Handshake

    - by Jay Ó Broin
    Ubuntu One repeatedly tries to sync my files but keeps disconnecting before anything is uploaded. Here are some of the messages from syncdaemon.log: 2012-01-08 12:12:34,068 - ubuntuone.SyncDaemon.ActionQueue - INFO - Connection started to host fs-2.ubuntuone.com, port 443. 2012-01-08 12:12:34,256 - ubuntuone.SyncDaemon.ActionQueue - INFO - Connection made. 2012-01-08 12:12:34,257 - ubuntuone.SyncDaemon.StorageClient - INFO - Connection made. 2012-01-08 12:13:08,832 - ubuntuone.SyncDaemon.StorageClient - INFO - Connection lost, reason: [Failure instance: Traceback (failure with no frames): <class 'OpenSSL.SSL.Error'>: [('SSL routines', 'SSL23_READ', 'ssl handshake failure')]]. 2012-01-08 12:13:08,833 - ubuntuone.SyncDaemon.ActionQueue - INFO - The request 'protocol_version' failed with the error: [('SSL routines', 'SSL23_READ', 'ssl handshake failure')] 2012-01-08 12:13:08,844 - ubuntuone.SyncDaemon.ActionQueue - WARNING - Connection lost: [('SSL routines', 'SSL23_READ', 'ssl handshake failure')] 2012-01-08 12:13:38,550 - ubuntuone.SyncDaemon.Main - NOTE - ---- MARK (state: <State: 'WAITING' (queues WORKING connection 'With User With Network')>; queue: 1378; hash: 0) ---- 2012-01-08 12:15:08,870 - ubuntuone.SyncDaemon.ActionQueue - INFO - Connection started to host fs-2.ubuntuone.com, port 443. 2012-01-08 12:15:09,033 - ubuntuone.SyncDaemon.ActionQueue - INFO - Connection made. 2012-01-08 12:15:09,034 - ubuntuone.SyncDaemon.StorageClient - INFO - Connection made. 2012-01-08 12:15:33,676 - ubuntuone.SyncDaemon.StorageClient - INFO - Connection lost, reason: [Failure instance: Traceback (failure with no frames): <class 'OpenSSL.SSL.Error'>: [('SSL routines', 'SSL23_READ', 'ssl handshake failure')]]. 2012-01-08 12:15:33,677 - ubuntuone.SyncDaemon.ActionQueue - INFO - The request 'protocol_version' failed with the error: [('SSL routines', 'SSL23_READ', 'ssl handshake failure')] 2012-01-08 12:15:33,692 - ubuntuone.SyncDaemon.ActionQueue - WARNING - Connection lost: [('SSL routines', 'SSL23_READ', 'ssl handshake failure')] 2012-01-08 12:15:38,551 - ubuntuone.SyncDaemon.Main - NOTE - ---- MARK (state: <State: 'WAITING' (queues WORKING connection 'With User With Network')>; queue: 1378; hash: 0) ---- I'm using Ubuntu 11.10.

    Read the article

  • rsnapshot intervals in configuration file…

    - by Patrick
    A simple question about rsnapshot. In order to perform daily backups I'm going to add lines to cron in my Ubuntu. Then why do I also have these lines in rsnapshot.conf?

        #########################################
        # BACKUP INTERVALS                      #
        # Must be unique and in ascending order #
        # i.e. hourly, daily, weekly, etc.      #
        #########################################
        interval        hourly  6
        interval        daily   7
        interval        weekly  4
        #interval       monthly 3

    If I use cron, should I disable them? Thanks. PS: I've just realized that in the crontab I still have "hourly" and "daily". Should I then uncomment only the ones I use in the crontab? And what's the point of specifying hourly if it is already specified in cron? I'm a bit confused.

        # crontab -e
        0 */4 * * *    /usr/local/bin/rsnapshot hourly
        30 23 * * *    /usr/local/bin/rsnapshot daily
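
    One way to read the relationship (a sketch, not a quote from the rsnapshot documentation): cron decides when each backup level runs, while the interval lines decide how many snapshots of each level rsnapshot keeps before rotating them out, so both are needed, and every interval left enabled in rsnapshot.conf should have a matching cron entry. A daily-only setup that keeps a week of dailies and four weeklies might look like this (times are arbitrary; check the rsnapshot docs for the recommended ordering of the jobs):

        # rsnapshot.conf - keep only the levels you actually schedule
        # (rsnapshot requires tabs, not spaces, between the fields)
        interval        daily   7
        interval        weekly  4

        # crontab -e
        30 23 * * *    /usr/local/bin/rsnapshot daily
        0  22 * * 0    /usr/local/bin/rsnapshot weekly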

    Read the article

  • cannot play VTC video file (.mov) using VLC player

    - by varunit
    I have a training course that contains .mov files whose video codec is Apple Graphics (SMC). I googled and found some options to add in order to play them (link from where I got help), but they only play fine in SMPlayer. I want to configure VLC similarly and play those files in VLC player. This is the error which I'm facing: Running vlc with the default interface. Use 'cvlc' to use vlc without interface. [0xb0d03d20] blend blend error: no matching alpha blending routine (chroma: RGBA - RGBP) [0xb0d03d20] main blend error: blending RGBA to RGBP failed It is repeating... Thanks for any help...

    Read the article

  • What are some fast methods for navigating to frequently used folders in Windows 7?

    - by fostandy
    (This is a followup question from my previous question.) In windows XP I used to be able to quickly navigate to frequently used folders by making use of the 'Favorites' menu item and the hotkey behaviour. In certain conditions it could be set up so that getting to a particular folder was as easy as alt-a x (and without a file explorer window open it was as fast as win-e alt-a x). I am struggling to get anywhere near this speed in Windows 7 and would like to solicit advice from others regarding fast folder navigation to see if I am missing any methods. My current way to navigate quickly is basically move hand to mouse move cursor to navigation pane/pain. scroll all the way to the top (because normally I the panel is focused on whatever deep directory structure I am already in). sift through my 50+ favorites to get the one I want, or click a link to a folder that contains further links in some sort of 'pseudo-tree' functionality. select it. This is slower than my previous method by upwards of an order of magnitude. There are a couple of things I've contemplated: add expandable folders, not just direct links, to the favorites menu. add expandable folders, not just direct links, to the start menu. add links of my favorite folders to a submenu of the start menu so that they come up when I search them. They do but this still rather cumbersome started using 7stacks - url here (I cannot link the url directly due to lack of reputation but http://www.alastria.com/index.php?p=software-7s). This is about the closest I've gotten to some sort of compact, customizeable, easy to access, tree based navigation structure. How do you power users quickly navigate to your favorite folders? Are there keyboard shortcuts I am missing? Can someone recommend other apps or addon or extensions that can achieve this sort of functionality? The Current solution (thanks to the answers below) I am going to use is a combination of Autohotkey and 7stacks - autohotkey to launch 7stacks, 7stacks with the 'menu' stack type for fast, key-enabled navigation to folders organised in a tree structure. This solves about 90% of the issue, the only issues are (note that these are really minor, I am really splitting hairs more than anything here) Can't use this for existing folder navigation (ie already have a explorer window open, want to go to another directory) A bit more cumbersome to add/remove entries to compared to xp favorites. A little slower than xp favorites. Whatever. I'm happy. Thanks guys. I think the answer is a split to John T and Kelbizzle - I've elected to give the answer to John T and +1 to Kelbizzle as I had already mentioned 7stacks.

    Read the article

  • openssl/rand.h header file not found

    - by Arun Reddy Kandoor
    I have installed libssl-dev package but that did not install the include files. How do I get the openssl include files? Appreciate your help. Checking for program g++ or c++ : /usr/bin/g++ Checking for program cpp : /usr/bin/cpp Checking for program ar : /usr/bin/ar Checking for program ranlib : /usr/bin/ranlib Checking for g++ : ok Checking for node path : ok /usr/bin/node Checking for node prefix : ok /usr Checking for header openssl/rand.h : not found /home/arun/Documents/webserver/node_modules/bcrypt/wscript:30: error: the configuration failed (see '/home/arun/Documents/webserver/node_modules/bcrypt/build/config.log') npm ERR! error installing [email protected] npm ERR! [email protected] preinstall: `node-waf clean || (exit 0); node-waf configure build` npm ERR! `sh "-c" "node-waf clean || (exit 0); node-waf configure build"` failed with 1 npm ERR! npm ERR! Failed at the [email protected] preinstall script. npm ERR! This is most likely a problem with the bcrypt package, npm ERR! not with npm itself. npm ERR! Tell the author that this fails on your system: npm ERR! node-waf clean || (exit 0); node-waf configure build npm ERR! You can get their info via: npm ERR! npm owner ls bcrypt npm ERR! There is likely additional logging output above. npm ERR! npm ERR! System Linux 3.8.0-32-generic npm ERR! command "node" "/usr/bin/npm" "install" npm ERR! cwd /home/arun/Documents/webserver npm ERR! node -v v0.6.12 npm ERR! npm -v 1.1.4 npm ERR! code ELIFECYCLE npm ERR! message [email protected] preinstall: `node-waf clean || (exit 0); node-waf configure build` npm ERR! message `sh "-c" "node-waf clean || (exit 0); node-waf configure build"` failed with 1 npm ERR! errno {} npm ERR! npm ERR! Additional logging details can be found in: npm ERR! /home/arun/Documents/webserver/npm-debug.log npm not ok
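
    A quick way to check whether the headers really are missing, and to put them back if so (a sketch; on Ubuntu the libssl-dev package normally ships /usr/include/openssl/rand.h):

        # Does the package ship the header, and is it actually on disk?
        dpkg -L libssl-dev | grep rand.h
        ls -l /usr/include/openssl/rand.h

        # Reinstall the development package if the file is missing
        sudo apt-get install --reinstall libssl-dev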

    Read the article

  • Change language encoding for file uploading

    - by jc.yin
    Previously we were running a Wordpress site on a Mac OS Server machine. We had several hundred images with Chinese characters for the image names. Now we're trying to migrate to a Ubuntu system and everything is fine except the images. Every time I try to upload an image with a Chinese name via FTP, I get the following message: "MyImage contains illegal characters. Please choose an appropriate text encoding" I have no idea how to solve this issue, do I need to somehow change the system language encoding in Ubuntu to allow for image uploading? Thanks
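
    One thing worth ruling out is the system locale on the new Ubuntu machine (a sketch; whether this fixes the FTP error also depends on how the FTP server and client negotiate filename encoding):

        # Check which locale is active and which locales are generated
        locale
        locale -a | grep -i utf

        # Generate UTF-8 locales and make one the system-wide default
        sudo locale-gen en_US.UTF-8 zh_CN.UTF-8
        sudo update-locale LANG=en_US.UTF-8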

    Read the article

  • DELL Inspiron 9400 SD Card Reader not working on Ubuntu 11.10

    - by Mario Martz
    Im new in linux, just installed Ubuntu 11.10 in a Dell Inspiron 9400, everything works fine with the exception of the SD card reader, everytime I insert a card the computer doesnt do anything, its like the SD card reader is not there. I did a $lspci and it shows the next drivers 03:01.0 FireWire (IEEE 1394): Ricoh Co Ltd R5C832 IEEE 1394 Controller 03:01.1 SD Host controller: Ricoh Co Ltd R5C822 SD/SDIO/MMC/MS/MSPro Host Adapter (rev 19) 03:01.2 System peripheral: Ricoh Co Ltd R5C592 Memory Stick Bus Host Adapter (rev 0a) 03:01.3 System peripheral: Ricoh Co Ltd xD-Picture Card Controller (rev 05) Everytime I insert a memory card, dmesg shows the next d status 0x600b00 [ 2687.227351] end_request: I/O error, dev mmcblk0, sector 64 [ 2687.229436] mmcblk0: error -110 sending read/write command, response 0x0, card status 0x600b00 [ 2687.229440] end_request: I/O error, dev mmcblk0, sector 65 [ 2687.230512] mmcblk0: error -110 sending read/write command, response 0x0, card status 0x600b00 [ 2687.230515] end_request: I/O error, dev mmcblk0, sector 66 [ 2687.231588] mmcblk0: error -110 sending read/write command, response 0x0, card status 0x600b00 [ 2687.231592] end_request: I/O error, dev mmcblk0, sector 67 [ 2687.232674] mmcblk0: error -110 sending read/write command, response 0x0, card status 0x600b00 [ 2687.232678] end_request: I/O error, dev mmcblk0, sector 68 [ 2687.234763] mmcblk0: error -110 sending read/write command, response 0x0, card status 0x600b00 [ 2687.234766] end_request: I/O error, dev mmcblk0, sector 69 [ 2687.236864] mmcblk0: error -110 sending read/write command, response 0x0, card status 0x600b00 [ 2687.236868] end_request: I/O error, dev mmcblk0, sector 70 [ 2687.238942] mmcblk0: error -110 sending read/write command, response 0x0, card status 0x600b00 [ 2687.238946] end_request: I/O error, dev mmcblk0, sector 71 [ 2687.238949] Buffer I/O error on device mmcblk0, logical block 8 [ 2687.241028] mmcblk0: retrying using single block read [ 2687.243104] mmcblk0: error -110 sending read/write command, response 0x0, card status 0x600b00 [ 2687.243108] end_request: I/O error, dev mmcblk0, sector 64 [ 2687.245212] mmcblk0: error -110 sending read/write command, response 0x0, card status 0x600b00 [ 2687.245215] end_request: I/O error, dev mmcblk0, sector 65 [ 2687.247298] mmcblk0: error -110 sending read/write command, response 0x0, card status 0x600b00 [ 2687.247302] end_request: I/O error, dev mmcblk0, sector 66 [ 2687.248389] mmcblk0: error -110 sending read/write command, response 0x0, card status 0x600b00 [ 2687.248393] end_request: I/O error, dev mmcblk0, sector 67 [ 2687.250476] mmcblk0: error -110 sending read/write command, response 0x0, card status 0x600b00 [ 2687.250480] end_request: I/O error, dev mmcblk0, sector 68 [ 2687.252617] mmcblk0: error -110 sending read/write command, response 0x0, card status 0x600b00 [ 2687.252621] end_request: I/O error, dev mmcblk0, sector 69 [ 2687.254737] mmcblk0: error -110 sending read/write command, response 0x0, card status 0x600b00 and more of the same but with different sector number Im using Kernel 3.0.0-12-generic By the way, when I was installing it and ubuntu asks about installation (If I want to install it along with windows or delete something or change the partitions of the HDD) if I go to the window to change the partition of the disc, linux detects the SD Card (if there's one inserted course). Any help with this it would be appreciated -sorry for my english Thank you

    Read the article

  • BlitzMax - generating 2D neon glowing line effect to png file

    - by zanlok
    Originally asked on StackOverflow, but it became tumbleweed. I'm looking to create a glowing line effect in BlitzMax, something like a Star Wars lightsaber or laserbeam. Doesn't have to be realtime, but just to TImage objects and then maybe saved to PNG for later use in animation. I'm happy to use 3D features, but it will be for use in a 2D game. Since it will be on black/space background, my strategy is to draw a series of white blurred lines with color and high transparency, then eventually central lines less blurred and more white. What I want to draw is actually bezier curved lines. Drawing curved lines is easy enough, but I can't use the technique above to create a good laser/neon effect because it comes out looking very segmented. So, I think it may be better to use a blur effect/shader on what does render well, which is a 1-pixel bezier curve. The problems I've been having are: Applying a shader to just a certain area of the screen where lines are drawn. If there's a way to do draw lines to a texture and then blur that texture and save the png, that would be great to hear about. There's got to be a way to do this, but I just haven't gotten the right elements working together yet. Any help from someone familiar with this stuff would be greatly appreciated. Using just 2D calls could be advantageous, simpler to understand and re-use. It would be very nice to know how to save a PNG that preserves the transparency/alpha stuff. p.s. I've reviewed this post (and many many others on the Blitz site), have samples working, and even developed my own 5x5 frag shaders. But, it's 3D and a scene-wide thing that doesn't seem to convert to 2D or just a certain area very well. I'd rather understand how to apply shading to a 2D scene, especially using the specifics of BlitzMax.

    Read the article

  • How to mount nexus 7 file system to view files from command line

    - by knotech
    Just got a Nexus 7, switched MTP on and plugged it in. It pops right up in nautilus, is titled Nexus 7, and lets me browse all of the files. However, on the command line, it's simply listed as /media/usbdrive and when I try to list the files, nothing comes up. Do I have to manually mount it in order to navigate the files from the command line? And why is it that nautilus recognizes it, and has access immediately, but I can't get at it from the command line?
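
    Nautilus reaches the device through a gvfs backend (MTP/PTP) rather than a normal block-device mount, which is why /media/usbdrive looks empty from the shell. A sketch of two ways to reach the files from the command line (package names and mount points are assumptions for this Ubuntu release):

        # 1) With gvfs-fuse installed, the Nautilus mount is usually exposed here
        #    (on newer releases it lives under /run/user/$UID/gvfs instead):
        ls ~/.gvfs

        # 2) Or mount the device yourself with a FUSE MTP filesystem (mtpfs,
        #    jmtpfs or go-mtpfs, depending on what is packaged):
        sudo apt-get install mtpfs
        mkdir -p ~/nexus7
        mtpfs ~/nexus7
        ls ~/nexus7
        fusermount -u ~/nexus7    # unmount when finished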

    Read the article
