Search Results

Search found 73708 results on 2949 pages for 'file systems'.


  • Sanity checks vs file sizes

    - by Richard Fabian
    In your game assets, do you make room for explicit sanity checks, or do you have some generally expected bounds which you assert? I've been thinking about how we compress data, and it seems much better to have the former and less of the latter. If your data can exceed its normal valid ranges, but exceeding them is an error, then surely that implies you're not compressing the data well enough? What do you do to find out whether your data is compressed as far as it can be, and what do you use to ensure your data isn't corrupted and is an official release? EDIT: I'm not interested in sanity-checking the file size, but in how you manage your sanity checks: do you pay the extra size for explicit check data, or do you allow each data member enough file space (member size) to go out of valid range, so that an asset can be validated merely by inspecting it in memory after loading?


  • Adding an executable file to the PATH and launching it from the terminal directly

    - by ubunnttuu
    I just downloaded Sublime Text for my Ubuntu machine, and it's working fine. I have the executable file in my ~/sublime folder. Now I would like to invoke this app from the terminal by just typing sublime. I suppose that is possible; please let me know how. Also, since this application did not need any configure/make/install (I just had to extract it and then use the executable to run the app), I cannot add it to my GNOME launcher favourite apps panel. How can I put the app shortcut there, so that when I go to the top-left corner and type sublime, the results will show the application and I can click on it and invoke the app from there? Thanks!
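
    A minimal sketch of one common approach, assuming the extracted binary is ~/sublime/sublime_text (the actual file name inside the folder may differ):

        # make it reachable as `sublime` from any terminal
        mkdir -p ~/bin
        ln -s ~/sublime/sublime_text ~/bin/sublime
        # Ubuntu's default ~/.profile adds ~/bin to the PATH once it exists;
        # log out and back in, or run: export PATH="$HOME/bin:$PATH"

        # make it show up in the Dash (top-left, then type "sublime")
        mkdir -p ~/.local/share/applications
        cat > ~/.local/share/applications/sublime.desktop <<EOF
        [Desktop Entry]
        Type=Application
        Name=Sublime Text
        Exec=$HOME/sublime/sublime_text
        Terminal=false
        Categories=Utility;TextEditor;
        EOF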


  • Execute a random command from a .txt file?

    - by Alberto Burgos
    I have an Ubuntu server, and I'm trying to print a Twitter quote using the app "twidge". So I made a list of tweets in a .txt file. I want to take one tweet (one per line) from that file and send it to Twitter via twidge (or whatever other method is possible). I can print a random phrase with shuf: shuf -n 1 /var/www/tweets.txt and it works: it gives me back one of the tweets. But it does not send it to Twitter, even if the phrase on that line is itself a command, i.e.: twidge update "bla bla bla" It just prints to the screen and doesn't send anything to Twitter. I tried turning the .txt into a .sh, but that didn't work... any ideas? By the way, I want to use it with crontab, something like this: 15 * * * * shuf -n 1 /var/www/tweets.txt
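
    The missing step is that shuf only prints the chosen line; nothing ever executes it. A sketch of two ways to wire it up (the twidge syntax is taken from the question itself):

        # if each line of tweets.txt is just the tweet text
        twidge update "$(shuf -n 1 /var/www/tweets.txt)"

        # if each line is a complete command such as: twidge update "bla bla bla"
        shuf -n 1 /var/www/tweets.txt | bash

        # crontab version of the first form (assuming twidge is on cron's PATH)
        15 * * * * twidge update "$(shuf -n 1 /var/www/tweets.txt)"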


  • Can only connect to file server on second attempt

    - by Ross Fleming
    I have a FreeNAS file server on my local network and I usually connect to it from Windows and Ubuntu computers. Ever since I upgraded from Ubuntu 12.04 to 12.10, Ubuntu will only connect on the second attempt. By which I mean: I browse to it via the file manager, and once I click on the link in "Bookmarks" it complains that it could not connect. If I then try again, it connects successfully and keeps its connection until the laptop is suspended or loses connection to the LAN for whatever reason. This isn't much of a problem, as I don't mind having to click twice, but my real problem is that my scheduled backup complains that it cannot connect to the storage device if the share has not already been accessed during the current session. Is there some way to either stop the issue altogether, or to force the (default) backup tool to immediately make a second attempt at connecting?
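
    One hedged workaround, rather than a fix for the underlying GVFS issue: have the backup job mount the share itself, retrying once, before the backup runs. The share URL below is a placeholder:

        # wrapper script for the scheduler to run instead of the bare backup command
        gvfs-mount smb://freenas/backup || { sleep 5; gvfs-mount smb://freenas/backup; }
        your-backup-command   # placeholder for the actual backup invocation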


  • Windows Server - share files without access for administrator

    - by Pawel
    We have a MS Windows Server 2008 R2 based server that is administered by our IT department. We would like to achieve two things simultaneously: a folder on the server, containing several thousand files (new files added frequently), that is accessible to some Active Directory users (e.g. the board of directors) but is not accessible to IT department employees; and IT department employees still maintaining the rights to administer the server, including installing new software and services. We already checked some solutions. Using NTFS access rights: unfortunately IT (members of the "Administrators" group) can set themselves as new owners of the files and change the permissions so that they gain access to the files. Enabling EFS: unfortunately, even if you do not allow IT to access files, they can still disable EFS completely because they have administrative rights; moreover, as far as I know, you have to manually add permissions for all users but the owner for each new file, which is very inconvenient. Creating a new role for the IT department that has all privileges apart from taking ownership of files: unfortunately, if you're not a member of the Administrators group you cannot install new software, no matter what privileges you add to the role. TrueCrypt: nice free encryption software, but with poor sharing capabilities; you can either mount an encryption container on the server (and then IT has access to its contents) or mount it locally, but then only one user can mount it for writing. AxCrypt: free encryption software that enables file-by-file encryption on the server. There are some disadvantages, though: you have to manually encrypt each new file added, the files have their extensions changed, and you can only set one password for all files (so all users have to know that one password). Any other ideas? Our budget is limited, so enterprise-class software from Symantec or PGP would probably not be an option.


  • VLC does not play any file (video or .mp3) on the local machine, closes when a file is opened

    - by hsemarap
    When I run vlc from the terminal I get the following. In the VLC dialog box:
    Your input can't be opened: VLC is unable to open the MRL 'file:///media/Ent/movies/the%20mask.avi'. Check the log for details.
    In the terminal:
    VLC media player 2.0.1 Twoflower (revision 2.0.1-0-gf432547)
    [0x8fe8f8] main input error: open of `file/xspf-open:///home/para/.local/share/vlc/ml.xspf' failed
    [0x8fe8f8] main input error: Your input can't be opened
    [0x8fe8f8] main input error: VLC is unable to open the MRL 'file/xspf-open:///home/para/.local/share/vlc/ml.xspf'. Check the log for details.
    [0x8f1aa8] main interface error: no suitable interface module
    [0x8f9b08] main interface error: no suitable interface module
    [0x8be008] main libvlc error: option http-user-agent does not exist
    [0x8be008] main libvlc: Running vlc with the default interface. Use 'cvlc' to use vlc without interface.
    [0x8f9b08] qt4 interface error: Unable to load extensions module
    [0x7f5280000b78] main input error: open of `file:///media/Ent/movies/the%20mask.avi' failed
    [0x8fc718] main playlist error: could not export playlist
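
    A hedged first step, since the log shows VLC failing on its own media-library playlist (ml.xspf) before it ever reaches the movie: move the library and configuration aside and let VLC recreate them on the next start.

        mv ~/.local/share/vlc/ml.xspf ~/.local/share/vlc/ml.xspf.bak
        mv ~/.config/vlc ~/.config/vlc.bak   # VLC rebuilds default settings on launch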


  • Mavericks permission issues with Windows Server deduplicated shares

    - by dmohlmaster
    We have a number of 10.9-10.9.3 (Mavericks) machines installed throughout our facility. Much of the user content is pulled from shares stored on our Windows Server 2012 file servers with deduplication enabled. I have found that files newly written or not yet optimized can be accessed without issue: read, written, modified, etc. Once a file gets optimized/deduplicated and Windows adds the P and L attributes (sparse and symlink), the Macs running Mavericks begin to have access issues. Once the files get deduplicated, users begin receiving read-access errors when copying files (see Error 1 below). This happens when copying to folders within the current folder tree or copying somewhere on the local system. If you stop the copy operation and retry a few more times, it may eventually work for that specific instance, but it will fail again later. I am, however, able to copy these files without issue via the Terminal. Other systems running 10.7 do not experience the same issues and can access file server resources without issue. Many of the systems having issues are newer and thus cannot be downgraded to 10.8 or 10.7. I have tried Finder replacements such as Path Finder, but the results are the same. I know this is at least similar to issues many Mac users are already experiencing and posting about, but I haven't seen it directly linked to deduplication and the attributes written by Windows Server. Has anyone seen this issue? Have any solutions been found? Error 1, when copying files after the P and L attributes have been set by deduplication: "One or more items can't be copied to "Folder" because you don't have permissions to read them." Via system.log, I am also seeing the following error when accessing these deduplicated file shares (the reparse point tag listed below is IO_REPARSE_TAG_DEDUP): "smbfs_nget: filename.ext - unknown reparse point tag 0x80000013"


  • ffmpeg: cut multiple input files with seeking to one output file

    - by Josef Kufner
    I have a list of video files (loaded from a database), each with the start and end time of the requested interval:
    # file    begin  end
    v1.mp4    1:01   2:01
    v2.mp4    3:02   3:32
    v3.mp4    2:03   5:23
    And I need to create a single video file containing these intervals: [0:00]---v1---[2:00]---v2---[2:30]---v3---[5:50] I prefer using ffmpeg, since it is installed on the server. The caller program is written in PHP. It is easy to cut one input to one output (argument escaping removed for clarity): exec("ffmpeg -ss $begin -i $input_file -t $duration -c copy $output_file"); where $duration is the interval length, $end minus $begin. Is there any easier way than executing ffmpeg for each interval and then executing it once more to concatenate the prepared clips together? I really do not want a lot of temporary files, or to deal with complex process handling.
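
    For reference, a shell sketch of the two-step route using ffmpeg's concat demuxer: cut each interval with stream copy, then join the pieces. The intervals are hard-coded here, where the PHP caller would generate them from the database:

        #!/bin/bash
        set -e
        to_sec() { IFS=: read -r m s <<< "$1"; echo $(( 10#$m * 60 + 10#$s )); }
        cut_clip() {   # usage: cut_clip input begin end output
            dur=$(( $(to_sec "$3") - $(to_sec "$2") ))
            ffmpeg -y -ss "$2" -i "$1" -t "$dur" -c copy "$4"
        }
        cut_clip v1.mp4 1:01 2:01 clip0.mp4
        cut_clip v2.mp4 3:02 3:32 clip1.mp4
        cut_clip v3.mp4 2:03 5:23 clip2.mp4
        printf "file '%s'\n" clip0.mp4 clip1.mp4 clip2.mp4 > list.txt
        # join without re-encoding; clean only if all clips share codec parameters
        ffmpeg -f concat -safe 0 -i list.txt -c copy output.mp4

    The single-command alternative is the concat filter combined with trim, which avoids the temporary files but forces a re-encode, so the two-step stream-copy route is usually faster.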


  • How to convert an XFS file system to HFS+

    - by user219350
    I have repeatedly been convinced of the reliability of the XFS file system, and I was more than satisfied. I was happy with everything in Ubuntu 14.04 (great software), but there is one little "but"! Mostly I work in OS X Mavericks 10.9.3, which sees Windows 8.1 fine and works wonders with NTFS, but does not see Ubuntu! Briefly, the equipment: ASRock B75 Pro3-M, i5 3330, GeForce GTX 650 Ti; a SATA 500GB disk running OS X Mavericks + Clover (the boot disk); a Toshiba 2TB disk running Windows 8.1 (x64) and Ubuntu 14.04 (amd64). If you boot from the Toshiba (where Ubuntu and the Windows boot entry live under GRUB), then after a restart it is impossible to boot from Clover. I tried a lot of options, both in the Clover installation and boot priority and in various GRUB settings, but have not found an acceptable one, and I have no desire to reinstall Clover again (Mavericks reboots in 20 seconds; excellent!). So please help with the file system: how do I convert from XFS to HFS+ journaled, so that Mavericks can see it and keep everything synced on the Mac? Thank you for a sensible answer and your help! (Originally in Russian.)
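
    There is no in-place converter from XFS to HFS+; the usual route is copy off, reformat, copy back. A hedged sketch from the Ubuntu side (device names and mount points are placeholders; double-check yours before running mkfs):

        sudo apt-get install hfsprogs
        rsync -aH /mnt/xfsdisk/ /mnt/spare/          # back everything up first
        sudo umount /dev/sdb1
        sudo mkfs.hfsplus -J -v "Shared" /dev/sdb1   # journaled HFS+
        sudo mount /dev/sdb1 /mnt/xfsdisk
        rsync -aH /mnt/spare/ /mnt/xfsdisk/

    Note that Linux mounts journaled HFS+ read-only by default, so if the disk must stay writable from Ubuntu as well, a non-journaled volume may be the better trade-off.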


  • After changing web host, I get a 'file does not exist' error

    - by Jordan
    I run a WordPress blog and have recently changed web hosts. When changing hosts, I copied all the files and exported/imported the database, etc., as explained by lots of tutorials easily found on Google. The blog home page works fine. What goes wrong: when I click on any link from the home page, the browser gets stuck in a redirect loop. Looking at the error log, I see: File does not exist: /usr/local/apache/htdocs/index.php. The directory /usr doesn't even exist for my website, so perhaps this is looking for a file that was present with my old web host and is no longer present with my new one? What is going on, and how might I resolve it?
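
    A common culprit when only the home page survives a host move is the rewrite setup: if the blog's .htaccess went missing in the copy, or requests are falling through to the server's default virtual host (which the stock path /usr/local/apache/htdocs suggests), permalinks will loop or 404. A hedged check, assuming WordPress in the web root: confirm the stock rewrite block is present, and that the siteurl/home values in the database point at the new domain.

        # stock WordPress .htaccess for a root install
        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteBase /
        RewriteRule ^index\.php$ - [L]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule . /index.php [L]
        </IfModule>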


  • Locate a user's bashrc file

    - by Starkers
    Really confused. Upon running cat /etc/passwd I found this: postgres:x:117:126:PostgreSQL administrator,,,:/var/lib/postgresql:/bin/bash meaning I have a postgres user, right? I want to change the bashrc environment file of this user to make commands available to it. /var/lib/postgresql doesn't contain a bashrc file, and /bin/bash doesn't contain it either, so I don't really know what's going on. All I know is that I created postgres using the useradd command, so why do I have some weird user with no home directory? So confused :(
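
    For reference, a sketch based on the passwd line quoted above: the sixth field (/var/lib/postgresql) is the home directory and /bin/bash is the login shell, so bash will read /var/lib/postgresql/.bashrc once that file exists; useradd just doesn't create one by default.

        sudo touch /var/lib/postgresql/.bashrc
        sudo chown postgres:postgres /var/lib/postgresql/.bashrc
        # test it: start an interactive shell as postgres with its own HOME
        sudo -u postgres -H bash -i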


  • Batch copy files with error log on missing permissions

    - by sc911
    Hi *, I'm searching for a tool to batch-copy files that should support the following points: copy files from a network share; report any errors; show only errors, or filter the log on errors; don't stop on an error; also report if a file or a folder could not be copied due to missing permissions; if possible, it should have a queue where new jobs can be added while copying. I tried the following tools: TeraCopy: takes a lot of time just to calculate the time and size of the job, and does not report errors due to missing permissions (it doesn't even add those files to the copy queue). Karen's Replicator: does not report errors due to missing permissions. xcopy: does a great job when using the right parameters and piping the output to a file (in the German localization, xcopy /k /r /e /i /s /c /h SOURCE TARGET > LOGFILE 2>&1 will do the job; opening the logfile in IE gives you a great monitor), but queuing jobs is not possible (OK, you can join them all in a batch file, but you cannot queue jobs while another one is running; hm, thinking of a batch script that loops through a file with the source-target config...). To be continued. Which tools do you use? Tell me! Thx sc911
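
    One tool missing from the list: robocopy, which ships with Vista/Server 2008 and later and covers most of these points out of the box. A sketch (verify the flags against robocopy /? for your version):

        robocopy \\server\share D:\target /E /ZB /R:1 /W:1 /NP /TEE /LOG:C:\copy.log

    /E copies subdirectories including empty ones, /ZB falls back to backup mode when access is denied (so permission problems surface in the log instead of stalling the run), /R:1 /W:1 keep retries short, and /TEE plus /LOG give a console view and a persistent error log. Queuing is still manual (a batch file of robocopy lines), as with xcopy.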


  • Synchronize two directories on Linux PCs

    - by Gab
    I need a distributed filesystem (or a synchronization tool) that is capable of keeping a directory synchronized across 4 PCs. My requirements are: offline access (data must be available offline on each PC); preserve execution rights (some files are marked executable on a Linux partition, and this flag should be replicated); an efficient sync strategy (some of my files are 20GB and are changed quite often, but in very small parts (VirtualBox images), so delta transmissions are welcome); efficient handling of space (no history for files, and files shouldn't be copied to temp directories "just in case you break it"); it must propagate deletions of files; modification can happen on any of the 4 PCs and should be propagated when the other PCs are connected. Other specs of my situation: sync is over a LAN; the total amount of data to be synced is around 180GB, in some ten thousand files; changes are small, but can happen in big files; at the moment I'm interested in a Linux-only solution; conflicts either don't happen or are solved with "last one wins". I haven't found any good solution. I've been trying: unison: the only one working at the moment, but during the hashing phase it hangs my PC for some minutes, disk light steady on. SparkleShare: doesn't handle large files nicely, and it keeps a history of all your changes that grows indefinitely (they promise it will be fixed in future releases, but at the moment it still doesn't fit my needs). ownCloud: keeps a history of each file I change. Coda? (help! I couldn't set it up correctly!). git-annex assistant: turns all your files into symlinks and marks the original file as read-only ("just in case you make a mistake while you modify it"!); before you edit a file you have to issue a special command, "git annex unlock", which creates a local copy of the file, and you have to remember to lock it again if you want it synchronized. What should I try next?
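
    Of the options already tried, unison is the closest fit, and its conflict handling can be told to behave as "last one wins". A hedged sketch of a pairwise invocation (hosts and paths are placeholders; with 4 PCs you would run such pairs in a star topology around one hub machine):

        unison /data ssh://hub//data -auto -batch -prefer newer -times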


  • Best alternative to a property file in Java

    - by Ranna
    Hey, I am working on a product which is live at multiple portals. The product is developed in GWT, Java, and Hibernate. My question is: is there any alternative to using a property file in Java? My requirements: for one property key there are multiple values, one for each portal the product is live on; each time I change the property file, I need to build the war again; and loading any of the properties should not be time-consuming. Any help or suggestion would be appreciated!
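
    One common answer to the first two requirements keeps plain property files but moves them outside the war, so each portal gets its own file and edits need no rebuild. A sketch; the -Dconfig.path flag name is made up for illustration:

        import java.io.FileInputStream;
        import java.io.IOException;
        import java.util.Properties;

        // Reads properties once at startup from a file named by the JVM flag
        // -Dconfig.path=/etc/myapp/portal.properties, then serves cached lookups.
        public final class PortalConfig {
            private static final Properties PROPS = new Properties();
            static {
                try (FileInputStream in = new FileInputStream(System.getProperty("config.path"))) {
                    PROPS.load(in);
                } catch (IOException e) {
                    throw new ExceptionInInitializerError(e);
                }
            }
            public static String get(String key) {
                return PROPS.getProperty(key);
            }
        }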


  • Where is the start up file located?

    - by starcorn
    Hello, I want to add some lines which should execute every time Ubuntu boots up, so I don't have to run them manually every time. I've read somewhere that you should edit the file /etc/rc.local. However, when I add the lines I want to execute at start-up, it doesn't run them. So I wonder, where is the start-up file located in Ubuntu? The lines I want to add change the sensitivity of the TrackPoint. One of the lines I want to add: echo -n 250 > /sys/devices/platform/i8042/serio1/serio2/sensitivity
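
    For reference, a minimal /etc/rc.local with that line in place. The two classic pitfalls are adding commands after the final "exit 0", and the file not being executable (sudo chmod +x /etc/rc.local):

        #!/bin/sh -e
        # rc.local: executed at the end of each multiuser runlevel
        echo -n 250 > /sys/devices/platform/i8042/serio1/serio2/sensitivity
        exit 0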



  • File access forbidden in htpasswd

    - by Nerd-Herd
    I have been using the htpasswd generated in this question, and it seemed to be working well until recently. Since yesterday, I have not been able to access the newest file created in the ChatLogs folder (named 10_07_2012.txt). The server returns a 403 Forbidden error saying: Forbidden: You don't have permission to access /ChatLogs/2012/07/10_07_2012.txt on this server. I am still able to access older files (up to 09 July 2012). At first I thought it might be because of file permissions, but they are the same as on the other 9 files in the folder. What could be the problem? Please help.
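
    A hedged first check, since only the newest file is affected: compare its ownership and mode against a working sibling, and remember that the web server also needs execute (x) permission on every directory along the path (paths assumed from the error message):

        ls -l /path/to/ChatLogs/2012/07/
        chmod 644 /path/to/ChatLogs/2012/07/10_07_2012.txt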


  • Web Platform Installer issues deploying Azure SDK 1.4 on refreshed systems.

    - by Enrique Lima
    Recently I have been doing quite a bit of testing on different ways to deploy the Azure SDKs. After a very successful couple of systems, I started running into issues last night. Here is the problem: if I go to the Windows Azure website, go to Develop, click on SDK and Tools, then Get Tools & SDK, it launches the Web Platform Installer. All seems well at that point: it goes through the initial process and finds the SDK files for 1.4, but since the tools for Visual Studio are still 1.3, the location throws back a 404, which causes the installer to fail. NOTE: If you already had SDK 1.3 and the tools in place, it will go through. The fix is to go directly to the Microsoft Download Center location and download the files. Here is the link: http://www.microsoft.com/downloads/en/details.aspx?FamilyID=7a1089b6-4050-4307-86c4-9dadaa5ed018


  • Sharing between Vista and Windows 7

    - by Metro Smurf
    Vista Ultimate 32-bit; Windows 7 Ultimate 64-bit. I've read through similar questions about sharing between Win7 and Vista, but none of them have resolved my issue of not being able to share between the two: Connecting to a Vista shared folder from Windows 7; Networking Windows 7 and Vista; Enable File sharing in Windows Vista. Previously I had my Vista and XP systems sharing back and forth without any problems; I was able to access the shares without entering a user name / password in the NT challenge prompt (note: account names and passwords were different on the Vista and XP systems). I have now replaced my XP system with a Win7 system, and when I attempt to access shares to/from Vista / Win7, I am continually prompted with an NT challenge to enter my credentials. Things I've verified/tried: both systems are on the same workgroup; Win7 is using the Home network and Vista is using the Private network (in other words, neither system is using a Public network profile); enabled file sharing with and without password protection on both Vista and Win7; tried HomeGroup connections (Win7) with both "Windows to manage connections" and "Use user accounts to connect"; reviewed too many online articles to count; set the shares to have full control by Everyone; set up the shares both directly on the directory and through the share manager. My question: how can I enable file sharing between Vista and Win7 without ever being prompted with a username/password challenge?


  • Guide to installing a fully encrypted file system?

    - by Michael Stum
    I have a little netbook on which I want to install Ubuntu 10.10 (32-bit). However, since it is a portable PC, I want to completely encrypt the file system (in case of theft). Currently it runs Windows 7 Starter, and I use TrueCrypt, which installs a custom boot loader that asks for the password. I remember from the past that Linux can do this as well, by putting /boot on its own unencrypted partition. Since it's been ages since I last worked with file system encryption (I remember setting up LVM and a custom-patched GRUB that asks for the password), I wonder how that works nowadays, and whether there is a step-by-step how-to for it?
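
    Nowadays the heavy lifting is done by the installer: the Ubuntu 10.10 alternate install CD offers "Guided - use entire disk and set up encrypted LVM", which builds the layout described (an unencrypted /boot, with everything else as LVM inside a LUKS container) and sets up an initramfs that prompts for the passphrase at boot. For orientation, a sketch of what that automation does under the hood (device names are placeholders):

        # /dev/sda1 -> small unencrypted /boot
        # /dev/sda2 -> LUKS container with LVM inside
        cryptsetup luksFormat /dev/sda2
        cryptsetup luksOpen /dev/sda2 cryptroot
        pvcreate /dev/mapper/cryptroot
        vgcreate vg0 /dev/mapper/cryptroot
        lvcreate -L 2G -n swap vg0
        lvcreate -l 100%FREE -n root vg0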


  • Strategies for removing register_globals from a file

    - by Jonathan Rich
    I have a file (or rather, a list of about 100 files) in my website's repository that still requires the use of register_globals and other nastiness (like custom error reporting, etc.) because the code is so bad, throws notices, and is 100% procedural with few subroutines. We want to move to PHP 5.4 (and eventually 5.5) this year, but can't until we can port these files over, clean them up, etc. The average file length is about 1000 lines. I've already cleaned up a few of the low-hanging fruit; however, the job took almost an entire day for two 300-500 line files. I am in a quagmire here (giggity). Anyway, has anyone else dealt with this in the past? Are there any strategies besides tracing backwards through the code? Most static analysis tools don't look at code outside of functions; are there any that will look at the procedural code and help find at least some of the problems?
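
    One migration pattern that keeps the site running while files are ported: a shim included at the top of each legacy file re-creates the globals explicitly, so register_globals can be switched off server-wide at once and the real clean-up (replacing globals with explicit $_GET/$_POST access) proceeds file by file. A sketch; note that it deliberately reproduces register_globals' injection risk, so it is a bridge, not a destination:

        <?php
        // legacy_globals.php: emulate register_globals for not-yet-ported files
        foreach (array_merge($_GET, $_POST, $_COOKIE) as $name => $value) {
            if (!isset($GLOBALS[$name])) {   // never clobber an existing variable
                $GLOBALS[$name] = $value;
            }
        }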


  • Help file formats - MSHA files v CHM files

    - by TATWORTH
    Recently I was tasked with producing a help file for a C#/WPF/Crystal Reports application using Sandcastle. I have previously blogged about the problems in doing that, and about the change going into the next version of Sandcastle that allows the vagaries of Crystal (the missing BusinessObjects.Licensing.KeycodeDecoder) to be handled. At http://social.msdn.microsoft.com/Forums/en-US/devdocs/thread/0b110502-f5bb-4c56-96a5-4347a2a7a68a/, I describe how I tried each of the formats. Two of the formats could not be built, and the error messages were not exactly helpful as to the cause; these two formats turned out to be obsolete. The MSHA format worked but was not suitable for a standalone application, so that left the older CHM format. I therefore asked on that thread whether the HTML Help 1 (CHM) format will continue to be supported for the foreseeable future. Rob Chandler, MVP in help systems, gave a very helpful answer, to the effect that there is not yet a replacement for the CHM format.


  • Website .htaccess file for WordPress subfolder

    - by ubique
    I developed a Flash website for a client and added the following .htaccess file in the root directory, and the non-www to www redirect works perfectly:
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^website.com [NC]
    RewriteRule ^(.*)$ http://www.website.com/$1 [L,R=301]
    I was also asked to add a WordPress blog, so I put it in a new directory (as opposed to a subdomain), so the URL is www.website.com/blog. Does Google now see the main site and the blog as two different websites? Do I need to link them together using another .htaccess file in the WordPress root so Google automatically crawls the whole domain? Any help appreciated...


  • How to find what files / directories are not copied yet?

    - by user8676
    Hi all, I found the following 'nice' situation: An archive of a few disks (actually three) which has a bunch of photos, more or less organized. Well, this is good. A big disk shared on a network which has a bunch of photos in a different folder structure (even if it is somewhat recognizable to a human being) than the archive described above, and some of the files on this big network share are the same as files in the archive. Well, this is bad. What we need is to move the different (new) files from the network share into the archive (perhaps we'll use a new disk added to the archive for this). The program we need is different from a regular file-duplicate finder, because a duplicate finder compares each file from all sources against every other; we want to find the differences between the two sources. It is fine for us to have a report generated as a text file, which we'll then use to do the move. A Windows solution is preferred. Any ideas? TIA
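
    A hedged sketch with md5deep, which runs on Windows: hash every file in the archive once, then list the files on the share whose content does not appear anywhere in the archive, regardless of folder structure. The output is exactly the text-file report to drive the move:

        md5deep -r E:\archive > archive.md5
        md5deep -r -x archive.md5 \\server\share > new_files.txt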

