Search Results

Search found 13869 results on 555 pages for 'memory dump'.

Page 290/555

  • List existing file server permission groups/users

    - by Patrick
    So we have taken over a new client and their existing file server is frankly a mess. We have migrated their old file server from a Windows 2000 box to a 2008 DFS cluster, and now I'm looking at rebuilding both the folder structure and the permissions. Unfortunately it's been half done with AD groups (poorly named, no descriptions or notes) and half with individual users named directly in the security settings on the folders themselves. What I'm looking to do is dump a complete list of all the folders with their security permissions (ideally I'd like to ignore files, but that's not essential). CACLS got me halfway there, but it fails with an odd error message, its output isn't particularly user friendly, and I'm working with roughly 2 TB/250,000 files here, so I really need something with a bit more functionality. Question: does anyone have experience of something similar, or know of a piece of software that might help me out?

    Read the article

  • automatically change the gnome-terminal "title" for the window

    - by tom
    Hi. I'm trying to change the title of a current gnome-terminal window (similar to the "Set Title" you can do manually). The system is running Fedora 9. The Xterm-Title HOWTO discusses how to set the title from the prompt for an xterm; I tried to implement the escape sequences with no luck (might be something weird). I tried using gconftool to dump/change/load the relevant conf attributes, again with no luck. I also set PROMPT_COMMAND, just in case the prompt command was somehow changing the title back (which is highly doubtful). Searching the 'net indicates that a few people have tried to solve this without success. I'd also like to figure out how to create a new gnome-terminal with a unique, specified title. Once this is solved, I'll gladly post a quick writeup on how to accomplish this for others. Thanks.
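    For reference, a minimal sketch of the escape-sequence approach (the title strings below are placeholders):

        # Set the current terminal's title with an xterm-style OSC sequence
        echo -ne "\033]0;My Custom Title\007"

        # If the title snaps back at the next prompt, the prompt is probably
        # resetting it; clearing PROMPT_COMMAND rules that out
        unset PROMPT_COMMAND

        # Open a new gnome-terminal with a specific title (gnome-terminal of
        # that era accepted a --title option)
        gnome-terminal --title="Unique Title" &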

    Read the article

  • Low FPS in some games, but hardware not fully used

    - by Mario De Schaepmeester
    I just did a funny little experiment in the game/sim Train Simulator 2013, in which I normally get good FPS (around 30) at full settings. What I did was make a really, really long train, so that the calculations the sim needed to make were enormous (the sim is quite realistic: it takes things into account like speed/acceleration, G-forces, comfort levels, possible wheel slip and much more, and most of those things for each carriage separately). This resulted in only 14 FPS as reported by the game, though it felt more like 8 FPS or so. I have a Logitech G15 keyboard with an LCD that lets me monitor CPU/RAM and video card load. The strange thing is, all CPU cores were busy, but the total load was only about 60% at most at all times. The video card was only at 30% load (possibly an important note: its memory was full, which is not unusual for this game, however). The RAM had plenty of room and didn't grow or shrink much. I just have the feeling that the game would run smoother if it used more of my hardware's power. Why is it not doing so? I saw the same thing in another game, The Elder Scrolls III: Morrowind, when using more than 100 mods (all of which use scripting), a few high-res texture mods, and a full-on graphics improvement program. That engine is very old (2002), so I thought this might be the cause (not being optimised for multithreading). I had thought of possible causes, like: the operating system doesn't let the games use all the resources, or the games don't make use of multi-threading appropriately. To eliminate the former, I ran a CPU stress tool, and it got 100% CPU juice as I let it run, so the OS is not the problem. I did give the game's thread "higher" priority, though.

    My actual question: in both games, I did things the engine was not really built to do or support. Can those games' framerates be limited by their own engines not being able to cope? What is the real reason, and more importantly, can I help it? And in any case, could something actually be wrong with my hardware? It's all reasonably new (a couple of months old), and I (almost) never experience any other trouble; modern and much more demanding games work absolutely fine.

    Specs:

        CPU: AMD Phenom II X4 965 @ 3.4 GHz
        RAM: 8 GB of DDR3 RAM
        Video: MSI GTX560 (nVidia chip) with 1 GB of GDDR5 memory
        OS: Windows 7 Ultimate 64-bit

    Nothing overclocked.

    Read the article

  • Can't use HTTPS with ServerXMLHTTP object

    - by Imraan
    I am supporting a Classic ASP application that connects to a payment gateway via HTTPS. Up until recently there were no issues. A few days ago this broke without the code, the IIS config or anything local changing, and it's broken on at least 3 separate servers. The last run of Windows Updates was in late November, but bringing the servers' updates up to date has not resolved the problem. A code snippet is below.

        Dim oHttp
        Dim strResult
        Set oHttp = CreateObject("MSXML2.ServerXMLHTTP")
        ' Option 2 (IGNORE_SERVER_SSL_CERT_ERROR_FLAGS) with value 13056
        ' ignores all server certificate errors; selecting a client
        ' certificate is a separate option (3)
        oHttp.setOption 2, 13056
        oHttp.open "POST", SOAP_ENDPOINT, false
        oHttp.setRequestHeader "Content-Type", "application/soap+xml; charset=utf-8"
        oHttp.setRequestHeader "SOAPAction", SOAP_NS + "/" & SOAP_FUNCTION
        oHttp.send SOAP_REQUEST

    Below is a dump of the error object:

        Number: -2147012852
        Description: A certificate is required to complete client authentication
        Message: A certificate is required to complete client authentication

    I initially posted the question on Stack Overflow (http://stackoverflow.com/questions/9212985/cant-use-https-with-serverxmlhttp-object) thinking it was a code issue, but further investigation seems to point to a server issue.

    Read the article

  • MacOS' dll equivalent

    - by kalaracey
    Hello all. As I understand it, a DLL is a shared bundle of code that multiple programs/executables can access at once, thus conserving memory (I think). What is the Mac's equivalent of a DLL? I was looking through the Google Chrome folders inside ~/Library/Application Support, and instead of the regular Windows Default.dll there was just "Default", an ordinary folder whose contents, I assume, would normally live inside the DLL. Does the Mac equivalent provide the same function?

    Read the article

  • how to set up a git repository which can be accessed over the network in Ubuntu 12.10

    - by hguser
    We want to set up a private git repository on Ubuntu 12.10 so that other developers can access it over the local network. At the moment I can only create a repository using git init, for example:

        cd myproject
        git init

    which creates a .git directory, but I do not know how to make it accessible over the network, like:

        git://192.168.1.1/myproject/.git

    Any idea? BTW, I have tried git init --bare, but then git add gives me an error: "fatal : malloc, out of memory"
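    For what it's worth, a minimal sketch of the usual setup with git daemon (the paths and IP are placeholders; note that a bare repository has no working tree, so git add is not meant to be run inside it):

        # On the server: create a bare repository to share
        git init --bare /srv/git/myproject.git

        # Serve everything under /srv/git over the git:// protocol
        # (read-only by default; pushing needs --enable=receive-pack
        # or an ssh:// remote instead)
        git daemon --base-path=/srv/git --export-all --reuseaddr &

        # On a client machine:
        git clone git://192.168.1.1/myproject.git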

    Read the article

  • App to slice'n'dice video, specifically remove chunks, on a Mac?

    - by Phillip Oldham
    I have a couple of collections of DVD box-sets I've ripped to my Mac. Now I'd like to sweeten the viewing experience by removing the title sequences and credits, so that viewing doesn't mean reaching for the remote to skip 30 seconds of annoying music (think watching multiple episodes of Family Guy). If I can find an app that will let me do this reasonably quickly by hand that would be great, but it would be perfect if I could dump a load of commands into a file and have everything trimmed while the Mac is "inactive". I'm thinking that if I can specify chunks of time to remove from the original file, that would be perfect. I had a quick look at importing into iMovie to do it manually and gave up at the "Processing Thumbnails" stage, as it said it would take a couple of hours to produce them for a 45-minute mp4 file, which I can understand at 25 fps, but I'm not willing to wait, especially when I've got over a week's worth of files. Any suggestions?
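    As a sketch of the batch-file approach described above, ffmpeg (assuming it is installed) can cut on time ranges without re-encoding; the file names and timestamps below are made-up placeholders:

        # cuts.txt, one job per line: input  keep-start  keep-end  output
        #   ep01.mp4 00:00:32 00:20:45 ep01-trimmed.mp4
        while read -r in start end out; do
            # -c copy avoids re-encoding, so each file takes seconds;
            # with stream copy, cut points snap to the nearest keyframe
            ffmpeg -i "$in" -ss "$start" -to "$end" -c copy "$out"
        done < cuts.txt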

    Read the article

  • Looking for a Unix tool/script that, given an input path, will pack the uncompressed text files there into roughly 100MB batches and gzip each batch

    - by newToFlume
    I have a dump of thousands of small text files (1-5 MB each), every one containing lines of text. I need to "batch" them up, so that each batch is of a roughly fixed size, say 100 MB, and compress that batch. A batch could be: a single file that is just a 'cat' of the contents of the individual text files, or just the individual text files themselves. Caveats: unix split -b will not work here, as I need to keep lines of text intact, and using the lines option is complicated because there is a large variance in the number of bytes per line. The batches need not be strictly fixed-size, as long as they are within 5% of the requested size. The lines are critical and must not be lost: I need to confirm that the input made its way to the output without loss, ideally via a checksum (something like CRC32, but better/"stronger" in the face of collisions). A script should do nicely, but this seems like a task someone has done before, and it would be nice to see some code (preferably Python or Ruby) that does at least something similar.
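    One hedged observation: GNU split already has a mode that packs whole lines up to a size limit, which covers most of this. A sketch (the input path is a placeholder):

        # --line-bytes packs as many complete lines as fit into ~100MB per
        # batch; no line is ever broken across batches, so batches come out
        # slightly under the limit (within the 5% tolerance above)
        cat /path/to/dump/*.txt | split --line-bytes=100M - batch_

        # end-to-end loss check: both pipelines should print the same hash
        # (the batch_* glob sorts in the same order split wrote the files)
        cat /path/to/dump/*.txt | sha256sum
        cat batch_* | sha256sum

        # compress each batch once verified
        gzip -6 batch_*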

    Read the article

  • looking for the best power supply-- building computer--- [closed]

    - by fello
    What would be an appropriate power supply and form factor for the specifications below?

        CPU: Intel Core i7-950 Bloomfield 3.06 GHz LGA 1366 130W quad-core
        Motherboard: ASRock X58 Extreme3 LGA 1366 ATX
        Hard drive: Seagate Barracuda 7200.11 ST31500341AS 1.5TB 7200 RPM SATA 3.0Gb/s 3.5"
        Memory: Kingston HyperX 8GB (2 x 4GB) 240-pin DDR3 SDRAM DDR3 1600 (PC3 12800)

    Read the article

  • Are these tools a gimmick?

    - by dotnetdev
    Hi, Are tools which enable you to use all your CPU cores (e.g. http://www2.ashampoo.com/webcache/html/1/product_2_0061_GBP.htm), and also tools which help you to regain memory (I can't think of any just now, but I've seen plenty), a gimmick? Do these tools really work? Thanks

    Read the article

  • Resource consumption of FreeBSD's jails

    - by Juan Francisco Cantero Hurtado
    Just out of curiosity. An example machine: a dedicated amd64 server with the latest stable version of FreeBSD and UFS partitions. How many resources does FreeBSD consume for each empty jail? I don't mean the resource consumption of a jailed server or whatever, just the overhead of each jail itself. I'm especially interested in CPU, memory and IO. For a few jails the overhead is negligible, but imagine a server with 100 jails.
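    A rough, hedged way to observe this on a test box (the jail name and path are placeholders, and the numbers will vary by release):

        # create a minimal persistent jail with no processes of its own
        jail -c name=test path=/rescue persist

        # confirm it exists; an empty jail is mostly just kernel state
        jls

        # per-process cost only shows up once something runs inside it
        ps -J test -o rss,vsz,command

        # tear it down again
        jail -r test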

    Read the article

  • How to optimally configure memcache running on 16 cores 144G ram server?

    - by Ivko Maksimovic
    Memcache is the only important app running on the server. The setup:

        Server has 16 cores and 144 GB RAM
        Memcache is given 135 GB
        Memcache runs with 32 threads
        Gigabit network; tests show at least 300 Mbit/s available on the network port
        600 connections
        3000 requests per second

    Say that memcache memory usage is at 50%, so it's definitely not full. As we increase the number of requests to the server, requests slow down (from 8 ms to 100 ms per request), but the server load remains 0.00. We suspect this can be solved by adjusting the configuration, but we don't understand many of the configuration parameters (besides, maybe, the number of threads). Any ideas?
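    For reference, the knobs above map to memcached's command-line flags; a sketch that restates this question's numbers (they are not tuned recommendations, and the -c value is an assumption):

        # -m: memory cap in MB, -t: worker threads (the usual advice is not
        # to exceed the core count), -c: max simultaneous connections (the
        # compiled-in default is 1024, which 600 connections can approach
        # under bursts)
        memcached -d -u memcache -m 135000 -t 32 -c 4096

        # live counters: connection and thread saturation show up in stats
        # long before system load does
        printf 'stats\nquit\n' | nc 127.0.0.1 11211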

    Read the article

  • Download or view a server's WINS database

    - by Segfault
    I am trying to troubleshoot a WINS browsing problem in a Server 2008 AD forest. I am in one domain and the problem is with a sibling domain. What command can I use to dump or view the WINS database on a particular AD server, by name, in a different domain from mine? I thought one of the subcommands of net would have an option for this, but I can't find it. I also tried browstat.exe getblist, but it gives me the error message "The list of servers for this workgroup is not currently available". I am not a domain admin and have no rights to either domain beyond those of a normal user. Does anyone know how this can be done?

    Read the article

  • Are there any USB flash drives or SD cards which use RAID or redundant storage for additional reliability?

    - by Luke Dennis
    I'm looking to get a fault-tolerant USB flash drive which saves data to multiple independent locations, whether using RAID or some other means of backing up data. Has a product like this ever been created, or is my only option to hack something together? (By the way: I'm aware that RAID doesn't prevent data corruption from software or the file system. I'm just looking for something that can handle one of the memory sticks going dead.)

    Read the article

  • xp stop error 50 page fault in non paged area

    - by Tony
    I have a laptop that fails to boot with the BSOD error: "page_fault_in_non_paged_area" STOP: 0x00000050 (0xEC6B738D, 0x00000000, 0x8649308C, 0x00000000). The laptop has 2 memory DIMMs. I removed each DIMM one at a time, and the error remained with just one DIMM installed. I have run SpinRite 6.0 on the hard drive, with no errors found. I booted to recovery mode and ran CHKDSK /R; it found and fixed errors, but the stop error remains. Any other suggestions to try?

    Read the article

  • Excel freezes when copying / cutting to paste elsewhere

    - by Barry
    When cutting or copying some cells to paste them into another sheet/page, Excel sometimes freezes/locks up and fades out, and the title bar says "(not responding)". Eventually, I must click 'X' to close the program. Windows offers to wait for the program to respond, but it never does; it just sits there until I finally close it, at which point it offers to recover files, etc. Is there a memory issue here? What can I do to stop it locking up?

    Read the article

  • Cron job failing to back up a Postgres database

    - by user705142
    I'm unsure what's going on here: I've got a backup script which runs fine under root and produces a 300 kB database dump in the proper directory. When it runs as a cron job with exactly the same command, however, an empty gzip file appears with nothing in it. The cron log shows no error, just that the command has been run. This is the script:

        #!/bin/bash
        DIR="/opt/backup"
        YMD=$(date "+%Y-%m-%d")
        # the nested double quotes only hold together because the target
        # path contains no spaces
        su -c "pg_dump -U postgres mydatabasename | gzip -6 > "$DIR/database_backup.$YMD.gz" " postgres

        # delete backup files older than 60 days
        # (note: -type d matches directories, while the backups are files)
        OLD=$(find $DIR -type d -mtime +60)
        if [ -n "$OLD" ] ; then
            echo deleting old backup files: $OLD
            echo $OLD | xargs rm -rfv
        fi

    And the cron job:

        01 10 * * * root sh /opt/daily_backup_script.sh

    It produces a database_backup file, just an empty one. Does anyone know what's going on here?
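    A hedged first debugging step (the log path is an assumption): cron's own log only records that the job ran, not what the command printed, and cron jobs run with a minimal environment, so capture the script's output and pin PATH:

        # /etc/crontab: capture stdout/stderr so su/pg_dump failures surface
        01 10 * * * root /bin/bash /opt/daily_backup_script.sh >> /var/log/db_backup.log 2>&1

        # near the top of the script: make sure pg_dump resolves under
        # cron's stripped-down PATH
        PATH=/usr/local/bin:/usr/bin:/bin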

    Read the article

  • Rolling Apache2 log files

    - by Andrew B
    I'm using a CollabNet svn distribution on Linux, and the log files are configured through the standard Apache httpd.conf. It's been a while since I dealt directly with Apache, but my memory and Google seem to indicate that the only way to rotate Apache log files is outside of Apache, using a periodic script. Is there some convenient way I'm missing to rotate these?
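    For what it's worth, Apache can rotate its own logs through piped logging with the bundled rotatelogs helper, with no external periodic script; a sketch for httpd.conf (the paths are assumptions, and the rotatelogs location varies by distribution):

        # start a fresh log every 86400 seconds (24h); rotatelogs expands
        # strftime patterns in the file name
        CustomLog "|/usr/sbin/rotatelogs /var/log/apache2/access.%Y-%m-%d.log 86400" combined
        ErrorLog  "|/usr/sbin/rotatelogs /var/log/apache2/error.%Y-%m-%d.log 86400"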

    Read the article

  • Xen command xl doesn't create a vm but xend/xm does

    - by ineff
    I'm a newbie to Xen, and I've recently installed Xen 4.2 from source on my system. I've found a strange thing: I have a VM, and when I start it via the command "xm create machine.cfg" everything works fine, but if I use "xl create machine.cfg" it gives me the following error:

        xc: error: panic: xc_dom_core.c:442: xc_dom_alloc_segment: segment ramdisk too large (0x4ba 0x2000 - 0x1bd9 pages): Out of memory
        libxl: error: libxl_dom.c:208:libxl__build_pv xc_dom_build_image failed: Invalid argument
        cannot (re-)build domain: -3
        xenconsole: Could not read tty from store: No such file or directory

    What could be the problem? Any idea?
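    One hedged guess at where to look: xend balloons dom0 down automatically to make room for a new guest, while xl only does so when autoballooning is enabled, so an "Out of memory" during domain build under xl alone fits that difference. A sketch of things to check (paths assume a stock Xen 4.2 install):

        # is dom0 holding almost all of the host's memory?
        xl list

        # xl's autoballoon setting lives in xl.conf
        grep autoballoon /etc/xen/xl.conf
        # if it is absent or disabled, try enabling it:
        #   autoballoon=1
        # or balloon dom0 down by hand before creating the guest:
        xl mem-set 0 4096m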

    Read the article

  • Is it safe to run two instances of svnserve on one repository, or only one?

    - by fredden
    We've two nodes running heartbeat/drbd, and one of the services we're using is subversion. What I want to know is: is it safe to run svnserve on both nodes all the time, or should it only run on the active node? Does svnserve use file-level locking, or is it all in memory? What are the implications of running svnserve without its repositories accessible? Please let me know if this isn't clear, and I'll try my best to rephrase/clarify. :)

    Read the article
