Search Results

Search found 21702 results on 869 pages for 'large objects'.


  • Enlarging everything on 16" 1920x1080 notebook display in Windows 7

    - by Rob
    Does Windows 7 have an option to enlarge everything on the screen, e.g. via a DPI setting? How well does this work, i.e. do objects look clear when enlarged? I ask this because software that enlarges non-photographic bitmap images, e.g. icons and symbols, can leave them artifacted, with harsh jagged slopes and blurred lines. I've tried the DPI setting in Windows XP, but it doesn't enlarge everything, and some things come out unclear in exactly that way. I'm looking at a notebook/laptop with this spec, having already enjoyed using a 15.4" 1920x1200 display for 5 years. I am buying the laptop for my father, who will probably prefer larger objects on the screen, although I want to provide some future-proofing by allowing more on the screen if needed. I'm not interested in answers that debate the merits or otherwise of 1920x1080 on a 16" display, please. The alternative option of 1366x768 seems too little.


  • Formatting a memory stick with two partitions?

    - by Marius
    I have a 16GB memory stick which used to have a Linux partition, so it has two partitions: a 2GB FAT32 partition and a 14GB Linux boot partition. The Linux part stopped working, so I decided to reinstall it, but Windows can't see that partition. I tried formatting the whole disk, but I can only format the one partition (the FAT32). There seems to be no way to combine the two partitions into one big one, and no way for Windows to repartition the large part of the memory stick to put Linux on it. In the Windows partition manager, Windows sees the large unused partition and lets me delete it, but once I have deleted it I'm not allowed to format it. I also cannot delete or resize the small partition. So, to summarize: I have a memory stick with two partitions. Windows only sees one of them and won't let me use the other one. I would like to combine the two partitions so I can install Linux on the memory stick again.
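
    A minimal diskpart session that should wipe the stick back to a single FAT32 partition, run from an elevated command prompt. The disk number here is an assumption, so check the output of list disk carefully before selecting anything:

        diskpart
        list disk
        select disk 1
        REM "clean" wipes the whole partition table, including the Linux partition
        clean
        create partition primary
        format fs=fat32 quick
        assign
        exit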


  • Problems when pasting Outlook 2010 signature logo into message body

    - by Austin ''Danger'' Powers
    Whenever I paste my company logo into a message in Outlook 2010, I run into a variety of complications and anomalies. The dimensions of my original logo image are 315x174 (the source image is a PNG file). I am scaling this image down in Photoshop CS6 to a variety of smaller sizes for testing my Outlook signature (300x166, 250x138, 200x110, 150x83 and 100x55 pixels). 300x166: no distortion; this looks the same as in Photoshop (but far too large to use in my signature). 250x138: distorted; it gets stretched much wider by Outlook when pasted into the message body. 200x110: looks reasonable, but seems to have been scaled to a different (smaller) size by Outlook for no obvious reason. 150x83: for some reason, this is scaled by Outlook to the exact same size that 200x110 was scaled to; in fact, a large range of similar dimensions are scaled to the exact same image size by Outlook, which is very frustrating. Why is this happening and what can be done to prevent it? 100x55: when pasting my logo from Photoshop to Outlook at these dimensions, all that happens is the cursor jumps forward about an inch on the screen, leaving a blank space where the image was supposed to go. Any advice would be much appreciated.


  • Did my registrar screw up or is this how name server propagation works?

    - by Brad
    So my company has a number of domains with a large registrar that shall go unnamed. We are making some changes to our DNS infrastructure, and the first of those is moving our secondary DNS from one server on site to four servers offsite. So we updated the name servers for each domain at the registrar, removing the entry for the old secondary name server and adding the four new ones. I monitored the old secondary server for requests, and when I saw no new requests had been made for 24 hours I shut it down. That was this morning. I assumed at this point everything was good. Unfortunately, this was my mistake: I should have gone and made sure name servers at large were returning the correct NS records. So this afternoon we were performing maintenance on our primary DNS server and shut it down. This is when I started getting alerts from our external monitoring. I checked, and sure enough, the DNS server used there reported that the only NS record for our primary domain was the primary name server. The new secondary servers were not listed, and neither was the old secondary. Is it unreasonable of me to have assumed that, because the update went from [ns1.mydomain.com, ns2.mydomain.com] to [ns1.mydomain.com, ns1.backupdns.com, ns2.backupdns.com, ns3.backupdns.com, ns4.backupdns.com] in one step at the registrar, there should be no intermediate state where the only NS record was for ns1.mydomain.com? Going forward, to be safe, I will obviously always leave the old name servers alone until I'm 100% sure the new ones have propagated, and only then remove the old name servers at the registrar. However, I'd still like to know whether my registrar screwed up or my expectation was unreasonable.
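
    A quick way to see what the world currently believes, for anyone in a similar spot: query the parent zone's servers and a couple of public resolvers directly with dig (mydomain.com is the post's placeholder; substitute your own domain):

        # what the .com TLD servers are handing out for the delegation
        dig NS mydomain.com @a.gtld-servers.net +short

        # what public recursive resolvers have cached
        dig NS mydomain.com @8.8.8.8 +short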


  • NTFS frequent corruption when writing many small files, index $I30 error

    - by david sedai
    I'm running Windows 7 Ultimate on a laptop with a 500GB HDD, with all partitions formatted as NTFS. I do a lot of programming and LaTeX typesetting, both of which involve a large amount of reading, writing, and deleting of small files, such as C++ library headers or LaTeX packages. The problem is that frequently, after a large number of file writes, the partition being written to corrupts: chkntfs e: reports the volume as dirty, where e: is the partition being written to. I've re-formatted the drive, I've contacted the laptop manufacturer and had the HDD checked (the HDD is not faulty and there are no bad sectors), and I've tried a brand new HDD, to no avail; the other partition on the same physical drive doesn't have this issue. So I'm pretty sure it's not hardware related. I've searched the Microsoft support pages; one page http://support.microsoft.com/kb/982018 provides an update for Advanced Format disks, which I've already installed. The chkdsk log shows $I30 index errors. I'm at a loss here. Can anyone help? Thanks in advance.
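
    For anyone hitting the same thing, the standard check-and-repair pass looks like this, assuming e: is the affected partition (the /r pass also scans the surface for bad sectors and takes much longer):

        REM is the volume currently flagged dirty?
        chkntfs e:
        REM repair index ($I30) and other metadata errors
        chkdsk e: /f
        REM optionally also scan for bad sectors
        chkdsk e: /r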


  • Looking for an application that scrolls or pans netbook screens running Windows

    - by therobyouknow
    I'm looking for a Windows 7 and XP compatible desktop panning/scrolling tool. This is to solve a problem where some applications, for example MSN, have settings/preferences windows that are not resizeable. I have a netbook with a small maximum screen resolution, e.g. 1024x600. The fixed, non-resizeable windows are too large for this screen size, so I cannot see all of the items on them, particularly the OK button to save settings. What I would like is a desktop scrolling/panning tool where, if I move my mouse pointer to any edge of the display, it pans to show the region of the too-large fixed window that I could not see. I use Samsung N110 and Toshiba NB100 netbooks. I'm looking for:
      - A general program that provides desktop panning/scrolling/expanded resolution, so that all regions of a fixed, non-resizeable window can be reached
      - Preferably a non-graphics-hardware-specific program, though I will accept a solution that works with both the above machines
    I'm NOT looking for (i.e. unsatisfactory answers I've already found when others have asked this):
      - Advice on which programs to use that DON'T have the problem of fixed windows
      - Alternative operating system solutions
      - Plugging in an external monitor with a larger resolution (I use this option, but I need a solution for when one is not available, e.g. while travelling)
      - Advice about not using small-screen netbooks (I enjoy their compact convenience)
      - Advice about changing the DPI settings in the Control Panel Display settings
      - Advice about guesswork with the Tab key to move the focus to the off-screen item I cannot see
    Thank you in advance.


  • Steps to deploy a custom routing protocol

    - by user134589
    I'm a Ph.D student researching a service-centric networking architecture with resource allocation on a large scale. What I'm looking to do is extend an existing routing protocol like OSPF with extra fields and some new message types that I need for communication between nodes. I want to manipulate the cost of a network link, and I want paths to be calculated as in OSPFv2/v3, but using the cost that my algorithms have calculated. What I have: the source code of OSPF from Quagga. I am assuming I can edit this code however I want, including packet structures and creating new types. Yes, I am aware it won't be easy, but this is a 6-year research project and I am eager to develop something new and move forward. What I need: I would like to know how I can deploy the edited OSPF source files I have (written in C) on any type of server. I have a large testbed environment available with hundreds of virtual nodes and pretty much any OS out there. So if I want to test my extended protocol, how do I make all the nodes in a network use it to communicate? I do not understand which parts of the kernel I would need to edit here. I have been searching for days now and I am unable to find how to deploy a non-existing routing protocol without the use of an application-level framework. If somebody could push me in the right direction that'd be awesome. Note: I need this to be a routing protocol and not an application, since I want it to work on top of the network layer for performance reasons. Thanks!
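
    For what it's worth, a Quagga routing daemon runs entirely in userspace and installs its computed routes into the kernel's forwarding table via the zebra daemon, so no kernel edits should be needed. A rough sketch of deploying a patched build on each test node (paths and flags assumed from a stock Quagga source tree):

        # build and install the patched tree on every node
        ./configure --enable-vtysh
        make && sudo make install

        # let the node actually forward packets
        sudo sysctl -w net.ipv4.ip_forward=1

        # zebra handles the kernel routing table; the patched ospfd speaks your protocol
        sudo zebra -d -f /etc/quagga/zebra.conf
        sudo ospfd -d -f /etc/quagga/ospfd.conf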


  • Disk doesn't contain a valid partition table

    - by Jeevan Dongre
    I was running an m1.small EC2 Ubuntu instance. I was running out of disk space, so I upgraded my instance to medium. When I upgraded I actually got 429.5 GB of space, and after that I added a 10 GB volume too. When I run sudo fdisk -l I get these results:

        Disk /dev/sda1: 8589 MB, 8589934592 bytes
        255 heads, 63 sectors/track, 1044 cylinders
        Units = cylinders of 16065 * 512 = 8225280 bytes
        Sector size (logical/physical): 512 bytes / 512 bytes
        I/O size (minimum/optimal): 512 bytes / 512 bytes
        Disk identifier: 0x00000000
        Disk /dev/sda1 doesn't contain a valid partition table

        Disk /dev/sda2: 429.5 GB, 429461078016 bytes
        255 heads, 63 sectors/track, 52212 cylinders
        Units = cylinders of 16065 * 512 = 8225280 bytes
        Sector size (logical/physical): 512 bytes / 512 bytes
        I/O size (minimum/optimal): 512 bytes / 512 bytes
        Disk identifier: 0x00000000
        Disk /dev/sda2 doesn't contain a valid partition table

        Disk /dev/sdf: 10.7 GB, 10737418240 bytes
        255 heads, 63 sectors/track, 1305 cylinders
        Units = cylinders of 16065 * 512 = 8225280 bytes
        Sector size (logical/physical): 512 bytes / 512 bytes
        I/O size (minimum/optimal): 512 bytes / 512 bytes
        Disk identifier: 0x00000000

    sda1 is the primary partition and sda2 is what I got from upgrading my instance to medium. But the problem persists: I am not able to pull code from git, which gives me this error:

        remote: Counting objects: 409, done.
        remote: Compressing objects: 100% (236/236), done.
        fatal: write error: No space left on device
        fatal: index-pack failed
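
    The "doesn't contain a valid partition table" message is normal for raw EC2 storage exposed as a whole-device block device; the real issue is that the extra space is neither formatted nor mounted, so the (full) root volume keeps taking the writes. A hedged sketch of claiming it, only if sda2 really holds no data yet (the mount point is an arbitrary example):

        sudo mkfs.ext4 /dev/sda2        # destroys anything already on sda2
        sudo mkdir -p /mnt/data
        sudo mount /dev/sda2 /mnt/data
        df -h                           # confirm the new space is visible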


  • Recommended Smartphone for Reading PDFs [closed]

    - by mika
    This is as much a software as a hardware question. I use a lot of public transport, and perhaps the best way to spend your time there is to read while listening to music. Currently I use a Nokia E90 and Adobe Reader LE 2.5 (full version). I was wondering if there are any better alternatives? Requirements:
      - At least a 640px-wide screen, preferably 800px
      - The physical size of the LCD display matters: it should be large, but the phone itself should be as small as possible, which favors touchscreen models
      - The PDF reader should be of high quality, rendering most PDFs correctly; other important features include full-screen mode, keyboard controls for Page Down and page change, multiple zoom levels to fit the screen, and opening recent documents at the last page read
    Downsides of the E90 + Adobe Reader LE: the phone is large compared to the display, the display is hard to read in sunlight, and Adobe Reader crashes the phone regularly, has too few zoom levels, and doesn't remember the last page. EDIT: I have since switched to an iPhone and GoodReader. The smaller physical screen width compared to the E90 is a step down, but other than that I'm happy. GoodReader is the highest-quality smartphone PDF reader I've seen so far.


  • Nginx + PHPBB3 reverse proxy images problem

    - by siberiano
    Hello all. I have a problem with my nginx frontend + Apache2 backend + phpBB3 setup: it doesn't load the CSS or the images. I get constant errors like these:

        2010/04/14 16:57:25 [error] 13365#0: *69 open() "/var/www/foo/styles/styles/coffee_time/theme/large.css" failed (2: No such file or directory), client: 83.44.175.237, server: www.foo.com, request: "GET /styles/coffee_time/theme/large.css HTTP/1.1", host: "www.foo.com", referrer: "http://www.foo.com/viewforum.php?f=43"

    This is my config for the site:

        server {
            listen 80;
            server_name www.foo.com;
            access_log /var/log/nginx/foo.access.log;

            # serve static files directly
            location ~* ^.+.(jpg|jpeg|gif|css|png|js|ico)$ {
                access_log off;
                expires 30d;
                root /var/www/trasteando/;
            }

            location / {
                root /var/www/foo/;
                index /var/www/foo/index.php;
            }

            # proxy the PHP scripts to the predefined upstream "apache"
            location ~ .php$ {
                proxy_pass http://apache;
            }

            location /styles/ {
                root /var/www/foo/styles/;
            }
        }
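
    The doubled /styles/styles/ in the error log is the giveaway: nginx's root directive is always prepended to the full request URI, so a location /styles/ block wants the parent directory, not the styles directory itself. A hedged sketch of the likely fix for that block:

        location /styles/ {
            root /var/www/foo;            # nginx appends /styles/... by itself
        }

        # or, equivalently, map the location onto an explicit directory:
        location /styles/ {
            alias /var/www/foo/styles/;   # alias replaces the matched prefix
        }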


  • How is the extra mSATA SSD disk used/configured in a Dell XPS laptop?

    - by Mark
    Some machines in the new XPS laptop range from Dell come with a regular, large (500GB+) HDD and an additional 32GB mSATA SSD. The only detail I can find about this extra drive on the Dell site is this:

        Store your important files, multimedia and photos with XPS 15's large hard drive options. To get instant access to your media, choose an optional mSATA solid-state drive (SSD) that can boot up to twice as fast as a regular hard drive and resumes in less than 1 second.

    I'd like to know more about how this extra drive is set up and used, specifically:
      - Is anything installed on it (e.g. OS files or a boot loader), or is it just used as swap space?
      - Is the mSATA drive visible as a lettered drive in Windows? (I'd guess not if it's used for a swap file only.)
      - Is this unusual configuration likely to cause any problems later down the line, e.g. when upgrading to Windows 8?
    As usual, Dell's sales team haven't been able to help. If anyone's actually got a Dell machine with this or a similar hard drive set-up and can give a definitive answer rather than speculation, I'll accept the answer.


  • Why does BitLocker need a minimum volume size of 64 MB?

    - by Iszi
    Since the future of TrueCrypt still appears to be unclear, I figured I'd try to get my stuff migrated into BitLocker, at least for the time being. I almost never have to access my encrypted data from anything that's not BitLocker-capable, so cross-platform compatibility isn't a big deal to me at this time. However, I am having a bit of an issue understanding the minimum requirement of a 64 MB volume. With TrueCrypt, I was able to protect small files (and most of my protected files are fairly small) in containers down to 300 KB or even less. When I finally created a VHD of an appropriate size last night (100 MB), it seemed the file system itself only took up about 3 MB, and encrypting it with BitLocker didn't appear to take up any more. While 3 MB is still an order of magnitude larger than the smallest volume I could make with TrueCrypt, it's relatively reasonable in comparison to 64 MB. This is an especially large amount of overhead (and largely wasted at that, since it's mostly empty space for now) when I consider that some of these volumes will be stored and synced in the cloud. What possible reasons could BitLocker have for needing volumes to be at least 64 MB, when it doesn't even appear to use that space? (Reference: the BitLocker FAQ on TechNet.)
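
    For reference, the kind of workflow being described, as a sketch: diskpart to build the small VHD, then manage-bde to encrypt it (the file path and drive letter are placeholders):

        REM inside diskpart: create and attach a 100 MB fixed-size VHD
        create vdisk file=C:\vaults\small.vhd maximum=100 type=fixed
        attach vdisk
        create partition primary
        format fs=ntfs quick label=Vault
        assign letter=V

        REM then, from an elevated prompt, encrypt with a password protector
        manage-bde -on V: -password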


  • MS Word custom dictionary making spellcheck slow - ideas?

    - by ezuk
    I have a user who edits technical materials. She uses MS Word's custom dictionary all the time for spelling; it has grown very large and is now making spell check very slow. All of the advice I've read online says to disable the custom dictionary. This is an easy solution, but is not workable for the user, because she actually needs this dictionary. So, is there any way to optimize the custom dictionary and/or Word itself, so that a large dictionary file doesn't slow things down quite so badly? Many thanks.

    Update after suggestions: I ran contig on the file, and it reports just 1 fragment, so I don't think that's the issue. The file is 9.95KB: 1,117 lines, each consisting of just a single word. I viewed the file in Notepad and none of the lines seems corrupted, strange, or overly long (no line seems to be over 10 chars or so). Both of your suggestions were helpful so I will upvote both; any further tips would be most welcome.
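
    One low-risk thing to try before anything drastic: sort and de-duplicate the dictionary file, since duplicate entries are pure overhead. A hedged PowerShell sketch (the UProof path is where Word 2007+ keeps custom.dic; the file is UTF-16, hence the explicit encoding, and it gets backed up first):

        $dic = "$env:APPDATA\Microsoft\UProof\custom.dic"
        Copy-Item $dic "$dic.bak"
        Get-Content $dic | Sort-Object -Unique | Set-Content -Encoding Unicode $dic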


  • Deploying multiple identical copies of a virtual machine for compute tasks

    - by Reid
    I have a compute task which has a large number of library dependencies. I would like to deploy it on some of my company's large Linux clusters, where I do not have root. I could probably track down, compile, and install the right versions of all the libraries, but this looks to be quite tedious and would have to be repeated if I deployed it again somewhere else. On the other hand, it's pretty easy to install on current Ubuntu. This led me to wonder about a virtual machine approach. Could I put together a virtual machine which booted up, ran the computation (with parameters from, and results to, the host), and then shut down? In other words, I'd like a command like this that I could run on the host:

        $ ./run-vm --ram N --task /path/on/host/foo.sh --results /another/host/dir/

    This would boot the VM, run foo.sh, and put the (relatively small) results of the computation in /another/host/dir/. It's important to be able to start many instances of the VM simultaneously, both on a single node and on multiple nodes of the cluster, so it would be nice if I didn't have to make many copies of the VM's virtual disk and metadata. As the task instances are completely independent, the VMs would not need any network support once deployed, or any outside communication beyond reading and writing the host filesystem. Is this possible, and if so, how might I go about doing it? Are there assumptions I've made above which are bogus?
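
    This maps fairly naturally onto QEMU/KVM's -snapshot mode, which boots from a shared read-only base image and sends all writes to a throwaway temporary file, so many instances can share one disk image; a 9p/virtfs share covers the host-directory part. A rough sketch (the image name, mount tag, and paths are all assumptions):

        #!/bin/sh
        # run-vm.sh: boot a throwaway copy of base.img, sharing a host dir with the guest
        qemu-system-x86_64 \
            -m 2048 \
            -snapshot \
            -drive file=base.img,if=virtio \
            -virtfs local,path=/another/host/dir,mount_tag=results,security_model=none \
            -nographic
        # inside the guest, an init script would mount the share and run the task:
        #   mount -t 9p -o trans=virtio results /results
        #   /results/foo.sh && poweroff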


  • Which default Database Systems come installed in Microsoft VS2010 Express?

    - by Tonygts
    Appreciate all advice on the following questions:
      - Which database systems (MS SQL 2008, MS SQL Compact, or others) come installed with the VS2010 Express edition?
      - SQL Server 2008 R2 Express is free; can we install it and integrate it with VS2010 Express?
      - How do I uninstall the databases that already came installed?
    I have installed VS2010 Express on Windows 7: just the VS2010 components (VB, C#, C++ and Web Developer), without installing anything else like SQL Express. In the Control Panel's Programs & Features window, the installed list is shown below:

        Microsoft SQL Server 2008 Setup Support Files
        Microsoft SQL Server 2008 Browser
        Microsoft SQL Server VSS Writer
        Microsoft SQL Server Database Publishing Wizard 1.4
        Microsoft ASP.NET MVC2 - VWD Express 2010 Tools
        Microsoft SQL Server 2008 Management Objects
        Microsoft SQL Server Compact 3.5 SP2 ENU
        Microsoft SQL Server System CLR Types
        Microsoft Silverlight 3 SDK
        Microsoft ASP.NET MVC 2
        Microsoft Visual Studio 2010 ADO.NET Entity Framework Tools
        Visual Studio 2010 Tools for SQL Server Compact 3.5 SP2 ENU
        Web Deployment Tool
        Microsoft Visual Web Developer 2010 Express - ENU
        Microsoft Visual C++ 2010 Express - ENU
        Microsoft Visual C# 2010 Express - ENU
        Microsoft Visual Basic 2010 Express - ENU
        Microsoft SQL Server 2008

    As you can see, Microsoft SQL Server 2008 (last line) and, near the top, Microsoft SQL Server Compact 3.5 SP2 ENU and many related SQL components such as Microsoft SQL Server 2008 R2 Management Objects are also installed. These were actually installed by installing VS2010 Express, but I have no idea how to use them or verify their valid existence from VS2010. Also, do I have to uninstall them before I install SQL Server 2008 R2, which is the latest version I believe? And what tool is needed to manage and create data sources and tables?


  • Nginx Multiple If Statements Cause Memory Usage to Jump

    - by Justin Kulesza
    We need to block a large number of requests by IP address with nginx. The requests are proxied by a CDN, so we cannot block on the actual client IP address (it would be the IP address of the CDN, not the actual client); instead, $http_x_forwarded_for contains the IP which we need to block for a given request. Similarly, we cannot use iptables, as blocking the IP address of the proxied client would have no effect. We need nginx to block the request based on the value of $http_x_forwarded_for. Initially, we tried multiple simple if statements: http://pastie.org/5110910 However, this caused our nginx memory usage to jump considerably: we went from somewhere around a 40MB resident size to over a 200MB resident size. If we changed things up and created one large regex that matched the necessary IP addresses, memory usage was fairly normal: http://pastie.org/5110923 Keep in mind that we're trying to block many more than 3 or 4 IP addresses... more like 50 to 100, which may be included in several (20+) nginx server configuration blocks. Thoughts? Suggestions? I'm interested both in why memory usage would spike so greatly using multiple if blocks, and also in whether there are any better ways to achieve our goal.
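
    One approach worth benchmarking here is nginx's geo module, which builds a single lookup table at configuration time instead of evaluating a chain of per-request ifs. A sketch (the addresses are placeholders; note that X-Forwarded-For can carry a comma-separated list, which geo won't parse, so this assumes the CDN sets a single address):

        # in the http block: map the forwarded address to a flag
        geo $http_x_forwarded_for $blocked {
            default         0;
            203.0.113.7     1;
            198.51.100.0/24 1;   # CIDR ranges work too
        }

        # in each server block that needs the blocking
        if ($blocked) {
            return 403;
        }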


  • Unrelated Files Corrupted on System Restore

    - by Yar
    I restored OS X 10.6.2 today (it was 10.6.3 and not booting) by copying the system over from a backup. The data directories were not touched. In the data directories, I'm seeing some files as 0 bytes and getting permission-denied errors when copying, even when using sudo cp or the Finder itself. Some programs, by contrast, take the files at face value and see no permission problems (such as zip), but they see the files as zero bytes, which would be game over for recovery.

        cp: .git/objects/fe/86b676974a44aa7f128a55bf27670f4a1073ca: could not copy extended attributes to /eraseme/blah/.git/objects/fe/86b676974a44aa7f128a55bf27670f4a1073ca: Operation not permitted

    I have tried sudo chown, sudo chmod -R 777 and sudo chflags -R nouchg, which do not change the end result. Strangely, this is only affecting my .git directories (perhaps because they start with a period, although renaming them, which works, does not change anything). What else can I do to take ownership of these files? Edit: This question comes from Stack Overflow because I originally thought it was a git problem. It's definitely not (just) git. Anyway, this is to help put some of the comments in context.
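
    Since chown, chmod and chflags were already tried, the remaining usual suspect on OS X is ACLs, which are separate from the mode bits and can deny access on their own; a hedged check-and-strip sketch:

        # -e shows any ACL entries attached to the files
        ls -le .git/objects/fe/

        # -N removes all ACL entries (an OS X chmod extension)
        sudo chmod -R -N .git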


  • WD Caviar Green Extremely Slow

    - by Steven
    I am encountering a really weird problem on my WD Caviar Green HDD. First of all, I have 2 HDDs in my desktop: one 160GB Seagate holding my Win7 Ultimate x64, and the problematic one, a WD 1.5TB Caviar Green for storage. My problem is kinda weird: when I transfer files from my Seagate (C:) to my WD (D:), the speed is good (50-60MB/s). The problem arises when I transfer too many large files; the transfer speed then drops straight down to kilobytes/s. After I cancel the transfer and access D:, even entering a folder takes around 10 seconds of loading. Such problems arise not only when I am transferring files to D:; it seems my WD can't handle much activity at all. For instance, last time I installed a game on D: I got a lot of lag after playing for some time, while the same game installed on C: has no such problem. Does anyone know what the problem is? P.S. There was one temporary workaround I used to try: after the situation occurs, I access as many folders on D: as I can and let them load; repeating this and giving it some time brings D: back up to speed. However, large transfers cause the situation to happen again. Does it have something to do with the cache?


  • htaccess mod_rewrite conundrum

    - by kelton52
    Ok, so I have this .htaccess file that contains this:

        <IfModule mod_rewrite.c>
            RewriteEngine On
            RewriteBase /
            RewriteCond %{REQUEST_FILENAME} !-f
            RewriteCond %{REQUEST_FILENAME} !-d
            RewriteRule . /index.php?p=%{REQUEST_URI}&%{QUERY_STRING} [L]
        </IfModule>

    The problem is that in Apache 2.2.22 the p and the other query parameters don't come through, but it works great in Apache 2.4.2. So basically, in Apache 2.2.22 it forwards everything to index.php but without any GET parameters. Any help, thanks.

    Update: I changed the line

        RewriteRule . /index.php?p=%{REQUEST_URI}&%{QUERY_STRING} [L]

    to

        RewriteRule ^(.*) /index.php?p=$1 [L,QSA]

    Now on Apache 2.2.22 the p GET parameter doesn't come through, but any specific queries I add do. So if I request http://localhost/index/fun/buns/funds?man=woman, on 2.4.2 I get

        array (size=2)
          'p' => string 'index/fun/buns/funds' (length=20)
          'man' => string 'woman' (length=5)

    and on 2.2.22 I get

        array(1) {
          ["man"]=>
          string(5) "woman"
        }

    To be clear: what's happening on 2.4.2 is what I want, and 2.2.22 isn't cooperating. Another update: it seems that when I request /index/whatever, Apache auto-assumes index.php; it auto-adds the .php when I don't want it to, and drops the rest. Any ideas on how to stop that?
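
    That /index to index.php auto-mapping is the signature of mod_negotiation's MultiViews feature, which can grab the request before mod_rewrite ever sees the path; a hedged tweak to try at the top of the .htaccess:

        # MultiViews content negotiation is what silently maps /index/... to index.php
        Options -MultiViews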


  • How to script GPO in Windows computers without Active Directory?

    - by Peteris Krumins
    Does anyone know how to script GPOs for users on a Windows computer that is not on any Active Directory network? I can't use the GPMC because it doesn't work without a domain. I have been searching the net for the last couple of hours, and all the solutions I find are related to the GPMC. I'd imagine there are some objects in the GPO that are accessible via WMI? I was unable to find any information about that. Here is the situation I am trying to script: I have 10 users on the machine and I want to restrict what they are able to do on it, so I created one GPO for each of them. Now I want to apply a common policy to all of them, and the only way to do that is to go through each GPO and make the change manually. This is too time-consuming, so I am seeking a simpler solution. I was unable to find a way to copy a GPO from one user to another; that would make things much easier, as I could create a GPO for the 1st user and then copy it over to all the other users, but no luck. The other method I tried was creating a GPO for the whole user group, but it turns out you can't apply a GPO to a group unless you use the GPMC, which I can't, because the computer is not on any domain. So I am thinking about scripting this whole process, but again I can't find any examples of how to access particular GPO objects for particular users and modify their properties through WMI. Any suggestions on the issue I am having? Thanks!


  • How do you update without cutting off users?

    - by Griffin
    I searched around and was surprised that I couldn't find an answer to this question. My assumption is that you have multiple servers, each normally doing its specific task (for the rest of this I will assume a simple website). Now let's say servers A and B need updates. Do you update server A while server B keeps serving the webpage, and then, when server A is okay, update server B? This seems like it would work at small scale, but seems horrible at large scale, because you'd need twice the capacity you normally have. When dealing with a large number of servers, do you update small sections at a time? I thought the problem with this would be that if server A can no longer work alongside servers B, C, D, E or F, losing that one server isn't so bad, but as the update rolls on you keep losing that small percentage of capacity. What is the proper way to deal with updates like this?
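
    A sketch of the usual rolling pattern, purely illustrative (lb-drain and lb-enable are hypothetical stand-ins for whatever drain/enable mechanism your load balancer provides):

        # roll through the pool one host at a time,
        # assuming N-1 hosts can carry the load during the roll
        for host in web01 web02 web03 web04; do
            lb-drain "$host"      # hypothetical: stop sending new traffic to this host
            ssh "$host" 'sudo apt-get update && sudo apt-get -y upgrade'
            lb-enable "$host"     # hypothetical: put it back into rotation
        done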


  • What can be done to improve time synchronization on networks with sporadic internet access?

    - by anregen
    I'm looking for advice setting up time servers for a very non-typical network. I support many closed networks that have occasional access to the internet. A network will get access most days for a few hours, but will frequently go blacked out for 1-3 weeks. The computers/servers on these networks are mostly *nix-based, but not all the same flavor. The entire network is mobile, so when it connects, it will have very different hops/latency to internet time servers. The servers on the closed network are powered off frequently (at least daily). Right now my gut tells me to use NTP (because I hate re-learning all the stuff that someone else already got working pretty well), but I have several issues, and am looking for someone with experience in this type of strange situation. I currently have no solution in place; I'm simply letting the internal clocks drift. This results in errors of ~600s in a majority of networks, and I have seen mismatches worse than 10,000s. Specifically:
      - Is there something "better" than NTP in this situation? I know NTP likes to have very frequent, consistent access to servers that give nearly identical answers, and I won't have that.
      - How many internal NTP servers should I configure so that, during periods of internet blackout, time is consistent within the closed network?
      - There is no human access, so no matter how large the mismatch, the servers must attempt to correct themselves.
      - Discrete steps are very bad: no matter how large the mismatch, the correction must be "slewed", not "stepped". I understand that this could take many hours to correct.
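
    For what it's worth, stock ntpd can be bent a long way toward those last two requirements; a hedged sketch of an internal server's ntp.conf (internal hostnames are placeholders):

        # upstream servers, used whenever the uplink happens to be up
        server 0.pool.ntp.org iburst
        server 1.pool.ntp.org iburst

        # peer the internal servers with each other, and fall back to
        # orphan mode (an isolated stratum-5 island) during blackouts
        tos orphan 5
        peer ntp2.internal.example
        peer ntp3.internal.example

        # never step and never give up, no matter how large the offset
        tinker step 0
        tinker panic 0

    The honest caveat: ntpd slews at no more than 500 parts per million, so a 10,000s error takes months to slew out. That physical limit, not the choice of protocol, is the hard part of the "never step" requirement.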


  • Seagate 3TB hard drive loses format information

    - by Victor Bugarin
    I have Windows 7 x64 Ultimate, 6 GB memory, a 1 TB system HDD, and a 3TB Barracuda XT HDD installed in a StarTech 4-bay external enclosure. I had troubles, so I converted the drive to GPT, created 1 partition, and formatted it as NTFS. I can write to and read from the hard drive, but it becomes unreadable at some point while I am copying files, or after I have copied files to it. I have copied large Blu-ray movies and diverse video files, 32 GB of pictures, and about 86 thousand music files in different formats. At some point the partition becomes unreadable, and I have to format it again (all files lost) and start the whole process over. At one point I was unable to copy large ISO (Blu-ray movie) images at all. I have also partitioned the HDD into 2 partitions (P1: 2TB, P2: 1TB) and lost every single file in either partition the same way. I reformat the HDD and it seems fine. I have run SeaTools to check the hard drive and it reports the drive to be OK. What gives?


  • Adding a transaction ID to ruby-on-rails logs

    - by Blue Warrior NFB
    We have a RoR app (Rails version 3.2.15 right now). As it has been getting busier, the log files it produces are becoming less and less useful for troubleshooting. When requests come in like this, it's not a problem:

        Started GET "/accounts/28088166/kittens/22894/rendered_png?file_id=5d3eaec77954a489b5ddd75143091767&kitten_store_id=9970569bbacf7b6dbeb4eb9295960d69&size=large" for 172.16.202.30 at 2013-11-12 13:45:00 +0000
        Processing by KittenController#rendered_png as HTML
          Parameters: {"file_id"=>"5d3eaec77954a489b5ddd75143091767", "kitten_store_id"=>"9970569bbacf7b6dbeb4eb9295960d69", "size"=>"large", "kitten_cam_id"=>"280941", "id"=>"kjlak357aw479607t"}
        Rendered text template (0.0ms)
        Sent data (1.8ms)
        Completed 200 OK in 1037.4ms (Views: 1.4ms | ActiveRecord: 98.4ms)

    A short request, quickly assembled; all the relevant log lines are in one block. However, not all of our code renders in 1037ms. A few calls can exceed several seconds, and during that time several of the quicker ones can come in. When that happens, it's very, very hard to identify which log lines belong to which GET:

        Sent data (4.1ms)
        Completed 200 OK in 767.4ms (Views: 3.2ms | ActiveRecord: 72.2ms)
        Completed 200 OK in 2338.0ms (Views: 0.2ms | ActiveRecord: 0.0ms)

    Ooookaaaay... which goes to what? Is it possible to add something like a transaction ID to these log lines? The log spam would still be interspersed, but at least grep magic could give me the unified entries that I need.
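
    Rails 3.2's built-in tagged logging looks like a close fit; a sketch, assuming the stock per-request UUID is an acceptable transaction ID (the application name is a placeholder):

        # config/environments/production.rb
        MyApp::Application.configure do
          # prefix every log line with the request's UUID (the X-Request-Id value)
          config.log_tags = [ :uuid ]
        end

    With that in place, grepping the log for a single UUID reunites all the lines for one request, even when slow and fast requests interleave.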


  • Modifying value of "Rating" column within Explorer for arbitrary file types

    - by Fake Name
    Basically, I have a large body of assorted media (text, images, flash files, archives, folders, etc.) and I'm attempting to organize it. Windows Explorer has a Rating column, but there seems to be no way to modify the rating of a file short of opening it in type-specific software (e.g. Media Player or Photo Viewer). That does not work when the file is of an unsupported type (.rar, .swf, ...) or is a directory. I'd be more than willing to consider a file-manager replacement (I've already looked at quite a few: Directory Opus, Total Commander, etc.), or even a solution that stores the rating metadata in a hidden file in each folder, or in a separate database. The one real critical requirement is the ability to sort by rating while being filetype-agnostic. Basically, is there any way to categorize a large collection of assorted files by rating that will work with any file type, including directories? Ideally, there would be an easy way to add arbitrary columns to Windows Explorer and edit them directly; however, there seems to be no way to do this. The Rating column is the next best thing.

