Search Results

Search found 11259 results on 451 pages for 'images'.


  • Configure Nginx to render static files and rewrite file extension or proxy_pass

    - by Pardoner
    I've set up Nginx to handle all my static files, else proxy_pass to a Node.js server. It's working fine, but I'm having difficulty rewriting the URL so that it removes the .html file extension.

        upstream my_upstream {
            server 127.0.0.1:8000;
            keepalive 64;
        }

        server {
            listen 80;
            server_name staging.mysite.com;

            root /var/www/staging.mysite.org/public;

            access_log /var/logs/staging.mysite.org.access.log;
            error_log  /var/logs/staging.mysite.org.error.log;

            location ~ ^/(images/|javascript/|css/|robots.txt|humans.txt|favicon.ico) {
                rewrite (.*)\.html $1 permanent;
                try_files $uri.html $uri/ /index.html;
                access_log off;
                expires max;
            }

            location / {
                proxy_redirect off;
                proxy_set_header X-Real-IP $remote_addr;
                proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
                proxy_set_header X-Forwarded-Proto $scheme;
                proxy_set_header Host $http_host;
                proxy_set_header X-NginX-Proxy true;
                proxy_set_header Connection "";
                proxy_http_version 1.1;
                proxy_cache one;
                proxy_cache_key sfs$request_uri$scheme;
                proxy_pass http://my_upstream;
            }
        }
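
    A possible explanation, with a sketch rather than a tested fix: the rewrite and try_files sit inside the static-assets location, which only matches URIs beginning with /images/, /javascript/, and so on, so page requests ending in .html never reach them. One hedged rearrangement, assuming the pages exist as .html files under root and everything else should fall through to Node:

        # Redirect /page.html to /page, then serve the matching .html
        # file before falling back to the Node.js upstream.
        location / {
            rewrite ^(/.*)\.html$ $1 permanent;
            try_files $uri $uri.html @node;
        }

        location @node {
            # the existing proxy_set_header / proxy_cache directives go here
            proxy_pass http://my_upstream;
        }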

    Read the article

  • Looking for suggestions for hosting Windows 2000 Server in the cloud / VPS / etc?

    - by JohnyD
    I have a Windows 2000 Server, currently virtualized in Hyper-V, that I would like to get running off-site as a backup (cloud, VPS, etc). You can't virtualize in EC2 and I'm fairly certain there are no Server 2000 AMIs floating about (correct me if I'm wrong!). If anyone has a recommendation on how I can get a virtualized Windows 2000 Server running in a secure, remote environment I would be grateful. As far as locations go, I'd be interested in North America as well as Australia and Europe. In a nutshell, we're ploughing our way out of a legacy codebase and this server is the last that remains of the legacy apps. However, it is still very much used by our clients. Everything is backed up each night (data, images, etc) to tape, which is then taken offsite. However, in the event of a fire I would love to have a backup legacy server to point DNS records to, so that while I am rebuilding from the ashes our services would already be available. It would save a lot of time and make my managers all the more happy (and that's what it's all about, riighhtt? :D) Thank you all for your suggestions. Please let me know if I've left out any important information. Additional info: the legacy codebase does not function properly on Server 2003.

    Read the article

  • Why does my PC successfully boot only when unplugged for more than a few minutes?

    - by philg
    I have an HP Pavilion Elite desktop computer, model HPE-490t. I like it because it didn’t cost too much, boots itself from an SSD, came with 16 GB of RAM, and has 6 CPU cores for editing video and camera RAW images. It has one behavioral quirk that I cannot explain, however. The recent power interruptions here in the Northeast got the machine into a state where it could not be restarted. It would power up for a second or two, shut down, and then power up again, never being able to get to the point of showing anything on the monitor. I unplugged it for about 10 seconds and plugged it back in. Same behavior (fails to boot). I unplugged it and walked away for an hour, then plugged it back in and it worked perfectly! I think something similar happened after installing a second hard disk drive into this machine. So the question is why does the computer behave differently depending on how long it has been unplugged? Where is energy stored that affects the machine’s ability to boot? Capacitors in the power supply? Battery on the motherboard (there is one for the clock, but that wouldn’t be exhausted by being unplugged for an hour, I don’t think)?

    Read the article

  • Virus on site but can't find where

    - by Rob
    WARNING! THIS IS ABOUT A VIRUS ON MY SITE. IT APPEARS IT HAS BEEN THERE FOR SOME TIME AND I'VE HAD NO PROBLEMS, BUT PLEASE BE CAREFUL. READ EVERYTHING I SAY AND SEE IF YOU CAN HELP ME WITHOUT VISITING THE LINK. AVG PICKS UP ON IT AND BLOCKS IT; MCAFEE DOES NOT. Sorry about the warning; obviously I'm not here to get anyone infected or anything like that. Basically I run the website sortitoutsi dot net. Ages ago I got a virus on my computer, they got hold of my FTP passwords and added some lines of JavaScript to the top of my site. I removed them and believed it was fixed. However, I'm using the "Web Developer" extension for Firefox, and when I chose to view all JavaScript on my page I found various links to horrible URLs such as: gittigidiyor-com.excite.co.jp.webmasterworld-com.eastmusicdirect.ru:8080/aboutus.org/aboutus.org/google.com/skycn.com/torrents.ru.php and gittigidiyor-com.excite.co.jp.webmasterworld-com.eastmusicdirect.ru:8080/index.php?jl= These terms do not appear anywhere in the source code, in any of the JavaScript, or in the CSS. I also can't see any rogue images that I don't recognise. So I've no idea where this JavaScript is coming from. Can anyone suggest how I can find references to these links and remove them? I can see them both in the Web Developer Firefox extension and in the Net tab in Firebug. Any help would be greatly appreciated.
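
    A hedged suggestion for tracking the injection down, assuming shell access to the web root (the path below is hypothetical): the markup is often generated by a rogue .htaccess rule, a modified template, or an eval(base64_decode(...)) blob rather than appearing as a literal URL, so search for both:

        # Search every file under the web root for the injected domain
        grep -rn "eastmusicdirect" /var/www/sortitoutsi/

        # Obfuscated payloads rarely contain the URL in clear text
        grep -rln "eval(base64_decode" /var/www/sortitoutsi/

        # Injected redirects also hide in .htaccess rewrite rules
        find /var/www/sortitoutsi/ -name ".htaccess" -exec grep -l "RewriteRule" {} \;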

    Read the article

  • 503 Error After Microsoft Request Routing Is Installed - 32 bit 64 bit madness

    - by KenB
    I have a requirement to install the Microsoft Request Routing component for IIS 7.5 running on a Windows 2008 R2 SP1 64-bit machine. After installing Microsoft Request Routing via the Web Platform Installer, our ASP.NET 4.0 application gets a "HTTP Error 503. The service is unavailable." The Windows event log error details say: The Module DLL 'C:\Program Files\IIS\Application Request Routing\requestRouter.dll' could not be loaded due to a configuration problem. The current configuration only supports loading images built for a AMD64 processor architecture. The data field contains the error number. To learn more about this issue, including how to troubleshooting this kind of processor architecture mismatch error, see http://go.microsoft.com/fwlink/?LinkId=29349. I can make this error go away by switching the application pool to 32-bit mode, i.e. setting "Enable 32-Bit Applications" to true, but I would prefer not to do that. My questions are: Why is the Microsoft Request Routing feature trying to load a 32-bit version? Isn't there a 64-bit version of it? And how do I resolve this issue without changing my application pool to 32-bit mode?
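
    One way to confirm the suspected bitness mismatch before touching the app pool (a sketch, assuming the Visual Studio tools' dumpbin.exe is available; the PE header records the target machine):

        dumpbin /headers "C:\Program Files\IIS\Application Request Routing\requestRouter.dll" | findstr machine
        rem "14C machine (x86)" means a 32-bit DLL was installed;
        rem "8664 machine (x64)" means it is already an AMD64 build.

    One commonly reported cause is that the Web Platform Installer, when launched from a 32-bit process, installs the x86 ARR package; if the header shows x86, uninstalling ARR and installing the x64 package directly may resolve it.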

    Read the article

  • Gifsicle: How to set it to not overwrite the original GIF file if the resulting modified GIF file is larger than the original?

    - by galacticninja
    About Gifsicle: Gifsicle is a command-line tool for creating, editing, and getting information about GIF images and animations. One of its features is (from its website): Optimize your animations! This stores only the changed portion of each frame, and can radically shrink your GIFs. You can also use transparency to make them even smaller. Gifsicle's optimizer is pretty powerful, and usually reduces animations to within a couple bytes of the best commercial optimizers. I call Gifsicle through this .BAT file in the Right Click - 'Send to' menu:

        @echo off
        :compressFile
        "C:\Programs\Compression Scripts\gifsicle\bin\gifsicle.exe" --batch -V -O3 %1%
        echo.
        echo.
        SHIFT
        if exist %1% goto compressFile
        PAUSE

    For this animated GIF file, however (http://i.minus.com/i7WdodY5Zwot3.gif), optimizing with the above commands results in a larger GIF file, and Gifsicle overwrites the original with it. Initial filesize: 7.57 MiB (7,942,886 bytes). After running through the above commands with Gifsicle: 7.64 MiB (8,017,622 bytes). Is there a way to prevent Gifsicle from overwriting the original file if its output file is larger than the original, while still overwriting the original if the output file is smaller? Details: OS: Windows 7. Gifsicle version: 1.63, from the binary provided at http://www.lcdf.org/gifsicle/
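
    Gifsicle 1.63 does not appear to have a keep-if-smaller switch, but the batch file can do the comparison itself. An untested sketch: write to a temporary file with -o (dropping --batch, which modifies in place) and compare sizes with the %~z modifier before copying back:

        @echo off
        :compressFile
        "C:\Programs\Compression Scripts\gifsicle\bin\gifsicle.exe" -V -O3 %1 -o "%TEMP%\gifsicle_out.gif"
        for %%A in (%1) do set origsize=%%~zA
        for %%A in ("%TEMP%\gifsicle_out.gif") do set newsize=%%~zA
        rem Overwrite the original only if the optimized file is smaller
        if %newsize% LSS %origsize% copy /Y "%TEMP%\gifsicle_out.gif" %1
        del "%TEMP%\gifsicle_out.gif"
        SHIFT
        if exist %1 goto compressFile
        PAUSE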

    Read the article

  • Developing high-performance and scalable zend framework website [on hold]

    - by Daniel
    We are going to develop an ads website like http://www.gumtree.com/ (it will not be like this one, but just to give you an idea) and we are having some issues regarding performance and scalability. We are planning on using Zend Framework for this project, but that is all I'm sure of at this point. I don't think a classic approach like Zend Framework (PHP) + MySQL + Memcache + jQuery (and I would throw Doctrine 2 in there too) will result in a high-performance application. I was thinking of making this a RESTful application (with Zend Framework) + NGINX (or maybe MongoDB) + Memcache (or eAccelerator -- I understand this will create problems with scalability on multiple servers) + jQuery, or maybe throw Backbone.js in there, with a CDN for static content, a server for images, and a scalable server for the requests and the rest. My questions are: What do you think about my approach? What solutions would you recommend for developing a high-performance, scalable application expected to have a lot of traffic using PHP (Zend Framework 2)? I would be interested in your approach. I should note that I'm a Zend developer and have been working with Zend for over 3 years; this is why I'm choosing it.

    Read the article

  • What is causing occasional white windows on my Mac?

    - by user63333
    Hello. I'm having a very strange problem with my Mac lately. When I'm working in an app and a new window pane or sheet is displayed, sometimes it comes up completely white. Once an app is having these problems, it will continue to bring up a blank screen for that particular window (although other windows work fine). After the app is relaunched, the window is fine again. What I'm noticing that's very strange is that although the interface turns completely white, the functions of the interface are still available, so I have to "navigate blindly" around the interface until I can relaunch. This occurs throughout the operating system. Screenshots: This is what happened when I tried opening the File menu in the Lightroom app. What happened to me on Lynda.com (in Firefox) after selecting the "Software..." dropdown. (All other dropdowns were fine. Reloading the page fixed it.) When I was decompressing a file, The Unarchiver launched and opened this white window. It still decompressed the file. This is what happened one time when I opened Finder (with TotalFinder) to my Downloads folder. This is something I've never seen before. This just started happening lately. What could be the problem? Thanks for your help. NOTE: since new users are not allowed to post images, just imagine blank white interface elements.

    Read the article

  • PC dies when running at 100% CPU

    - by user155631
    I recently wrote some Java code to generate images of the Mandelbrot set (fractal). I made use of the new Fork/Join facility in Java 7 to run separate threads on all four cores (2 real, 2 virtual) simultaneously, using a large number of iterations for greater accuracy. The problem is, the process runs fine for about a minute, and then it's as if someone has pulled the plug and the PC just dies. I thought it must be the CPUs overheating, so I ran Real Temp to monitor the temperature. It's an Intel i3 processor. I can see the temperature creeping up to 70 degrees, and then it seems to level off there and run for about another 30 seconds before dying. According to Real Temp, there's still a gap of 35 degrees between the actual temperature and TJ Max. I also tried disabling the "CPU TM function" in the BIOS, but the problem still occurs. A colleague suggested that it might be a power supply problem, so I borrowed a more powerful PSU (I can't remember what wattage it was, but it's higher than mine, which is 500W). The exact same thing still happens though. Is anyone able to suggest what the problem might be, or what I can try next?

    Read the article

  • How to automate downloading files?

    - by Damon
    I got a book which had a pass to access digital versions of hi-res scans of much of the artwork in the book. Amazing! Unfortunately, the presentation of all of these is 177 pages of 8 images each, with links to zip files of JPGs. It is extremely tedious to browse, and I would love to be able to get all the files at once rather than sitting and clicking through each one separately. The pages run from archive_bookname/index.1.htm to archive_bookname/index.177.htm, and each of those pages has 8 links to files such as <snip>/downloads/_Q6Q9265.jpg.zip, <snip>/downloads/_Q6Q7069.jpg.zip, <snip>/downloads/_Q6Q5354.jpg.zip, which don't quite go in order. I cannot get a directory listing of the parent /downloads/ folder. Also, the files are behind a login wall, so using a non-browser tool might be difficult without knowing how to recreate the session info. I've looked into wget a little, but I'm pretty confused and have no idea if it will help me with this. Any advice on how to tackle this? Can wget do this for me automatically?
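
    wget can do this, provided the session survives outside the browser. A sketch, untested and with a hypothetical host name, assuming the login is cookie-based: log in with a browser, export its cookies to cookies.txt (various browser add-ons can do this), then have wget fetch each index page and follow the zip links one level down:

        for i in $(seq 1 177); do
            wget --load-cookies cookies.txt \
                 --recursive --level=1 \
                 --accept "*.zip" \
                 "http://example.com/archive_bookname/index.$i.htm"
        done

    With --accept, wget still downloads each index page to harvest its links but deletes it afterwards, leaving only the zip files.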

    Read the article

  • How do I prevent my ASP .NET site from continually prompting for user credentials?

    - by gilles27
    I'm trying to get an ASP .NET website up and running on IIS6. The site will run in its own application pool, and uses Windows authentication, with anonymous access turned off. When I run the app pool under NETWORK SERVICE, everything works fine. However we need the app pool to run under a different account, because this account needs some extra privileges (we are printing Word documents). This new account is a member of the local users group, and the IIS_WPG group. It has also been granted the "Log on as a service right". When I browse to the site I am prompted for credentials, not once, but several times. When the page finally loads it looks wrong because the style sheets have not been applied. My suspicion is that I am being prompted once for each file (e.g. all the images, styles and script files) the browser requests, and that for some reason the website is unable to validate those credentials in order to serve the files back. If I allow anonymous access the page loads fine - we don't want to allow it but I mention it in case it offers any further clues. My theory is that perhaps the account the app pool runs under needs permissions to validate domain credentials? If that is so, how do I enable this?
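
    One cause that fits these symptoms (a guess, not a confirmed diagnosis): with Integrated Windows authentication, clients request Kerberos tickets for the site's HTTP service principal name. That SPN is implicitly covered when the pool runs as NETWORK SERVICE (the machine account), but with a custom domain account the SPN must be registered to that account, or negotiation fails on every request and the browser re-prompts. A sketch with hypothetical names:

        rem Register the site's short and fully-qualified SPNs to the pool account
        setspn -A HTTP/intranetsite MYDOMAIN\AppPoolAccount
        setspn -A HTTP/intranetsite.mydomain.local MYDOMAIN\AppPoolAccount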

    Read the article

  • Imagemagick convert with resample option

    - by coneybeare
    I am creating thumbnails from much larger images and have been using this command successfully for some time:

        convert FILE -resize "64x" -crop "64x64+0+16" +repage -strip OUTFILE

    I also do some other processing that is not relevant to the question. I realized that this does not adjust the resolution at all, so if I use a 300dpi image, it ends up displaying really small on some devices. I want to resample it to 72x72, so I have been trying this command:

        convert FILE -resize "64x" -crop "64x64+0+16" +repage -strip -resample 72x72 OUTFILE

    I expected the 64x64 image at 300dpi to be resampled to a 64x64 image at 72dpi, but instead I am getting a very funny size and density. Here is "identify" output for the original and post-processed file WITHOUT the resample:

        coneybeare $ convert "aa.jpg" -crop "64x64+0+16" +repage -strip "aa.png"
        coneybeare $ for image in `find . -type f`; do identify $image; identify -verbose $image | egrep "^ Resolution"; done
        ./aa.jpg JPEG 1130x1695 1130x1695+0+0 8-bit DirectClass 1.492MiB 0.000u 0:00.000
          Resolution: 300x300
        ./aa.png PNG 64x64 64x64+0+0 8-bit DirectClass 7.46KiB 0.000u 0:00.000
          Resolution: 118.11x118.11

    And here is the "identify" output for the command WITH the resample:

        coneybeare $ convert "aa.jpg" -crop "64x64+0+16" +repage -strip -resample 72x72 "aa.png"
        coneybeare $ for image in `find . -type f`; do identify $image; identify -verbose $image | egrep "^ Resolution"; done
        ./aa.jpg JPEG 1130x1695 1130x1695+0+0 8-bit DirectClass 1.492MiB 0.000u 0:00.000
          Resolution: 300x300
        ./aa.png PNG 15x15 15x15+0+0 8-bit DirectClass 901b 0.000u 0:00.000
          Resolution: 28.34x28.34

    So, the question is: What am I doing wrong, and how can I fix it so the end result is a 64x64 cropped thumbnail image at 72dpi?
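
    Two things appear to be going on here. First, identify reports PNG density in pixels per centimetre, so 118.11 px/cm is just the original 300 dpi carried over (118.11 x 2.54 ≈ 300) and 28.34 px/cm is exactly 72 dpi. Second, -resample rescales the pixel data to keep the physical print size constant at the new density, which is why 64 px at 300 dpi became roughly 15 px at 72 dpi (64 x 72/300 ≈ 15). To keep 64x64 pixels and only stamp 72 dpi metadata, set the density instead of resampling (a sketch, untested):

        convert FILE -resize "64x" -crop "64x64+0+16" +repage -strip \
                -units PixelsPerInch -density 72 OUTFILE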

    Read the article

  • Why can't I mount an image hosted on a read-only HFS+ partition via Boot Camp?

    - by deceze
    I have come across the following phenomenon and would like to know how leaky Windows' file system abstraction is or if there's something else involved. I partitioned the hard disk of my MacBook Pro and installed Windows 7 (64 bit). The Boot Camp driver package includes file system drivers that enable Windows to access the Mac OS HFS+ partition. It's read-only access, but it works. Now, I have some disk images of stuff I usually install, so I grabbed a copy of Daemon Tools to mount them. When I mount an image saved on the HFS+ partition, about two out of three installers on these disks (usually InstallShield) crash with all sorts of weird errors. Most are just gibberish that lead to all sorts of non-solutions on Google, one was "This application is not the right type for your computer, check if you need 32 or 64 bit versions." When moving the image files to another Windows 7 computer on the network and mounting them from the network share, they work fine. My question now is, why do applications behave differently depending on whether the read-only image file, which should be abstracted away through the read-only virtual Daemon Tools drive, is located on a read-only HFS+ partition or on a Windows network share? And I'll just roll this into the question as well since I was wondering: Does the file system of a network share matter? Does the client system need to understand the file system of the share host or is that abstracted away in SMB?

    Read the article

  • Firefox takes a really long time to load some sites on Ubuntu

    - by Dave
    Hello guys, I have an issue here. Some sites - just a few - take a really long time to load in Firefox. One example is A List Apart (http://www.alistapart.com/), which takes more than 30 minutes (yes, minutes, not seconds). In Opera, or even through a telnet session, the problematic sites run without problem, fast as expected. I am using Ubuntu 8.04, running Firefox 3.6.3 downloaded from the Mozilla site, with a 10M ADSL connection. I tried many tweaks I found by googling, like disabling IPv6 and changing the HTTP pipelining settings in Firefox's about:config. None worked. I also used Firebug to find which phase of the negotiation is the bottleneck; findings are in the screenshot. Well guys, any idea what the issue is? And how to solve it? I repeat, this only happens with Firefox (3.6.3 and prior versions), and only for a few sites (even sites with many more requests, images, javascripts and stylesheets work fine), and the HTTP pipelining and IPv6 tweaks in about:config didn't work. Thanks

    Read the article

  • How to set up Apache to catch a proxy_pass from nginx?

    - by Paté
    I have a working Apache vhost setup such as:

        <VirtualHost localhost:10006>
            DocumentRoot "/home/pate/***/git/kohana_site/public/site/"
        </VirtualHost>

        <VirtualHost *:10006>
            ServerName api.*
            DocumentRoot "/home/pate/***/git/kohana_site/public/api/"
            LogLevel debug
        </VirtualHost>

    If I point to localhost:10006 I get my website, and at api.localhost:10006 I get my API. Then I have haproxy set up on top of that, running on port 10010, and both localhost:10010 and api.localhost:10010 have the expected behaviour. Now I have nginx set up on port 80 with this configuration:

        server {
            listen 10000;
            server_name api.*;

            location / {
                proxy_pass http://legacy_server;
            }
        }

        server {
            listen 10000 default;
            server_name _;

            location /nginx_status {
                stub_status on;
                access_log off;
            }

            # images are accessed via the CDN over HTTP (not https)
            location /n/image {
                proxy_pass http://image_caching_server;
            }

            location / {
                return 301 https://$host:10014$request_uri;
            }
        }

        upstream legacy_server {
            server localhost:10010 fail_timeout=0;
        }

    The problem is that Apache does not recognize the vhost properly and redirects api.localhost to the website instead of the API. I tried playing with set_proxy_header Host $host but it doesn't seem to do anything.
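
    A sketch of a likely fix, untested: the directive is proxy_set_header (set_proxy_header is not an nginx directive), and it needs to sit in the location that does the proxy_pass. By default nginx sends Host: $proxy_host (here, "legacy_server"), so Apache's name-based match on api.* never fires and the first vhost wins; haproxy in between should pass the header through unchanged.

        server {
            listen 10000;
            server_name api.*;

            location / {
                # forward the original Host so Apache's vhosts can match
                proxy_set_header Host $host;
                proxy_pass http://legacy_server;
            }
        }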

    Read the article

  • Amazon EC2: Instances, IPs and a wordpress blog (LAMP)

    - by JustinXXVII
    I had a link to my blog posted on Reddit yesterday and MySQL crashed on my EC2 Micro instance. I know I didn't have that many visitors because I used a marketing link that tracks hits. The link got 167 hits over the course of the last 18 hours, and MySQL crashed twice. So anyway, 167 visits is not a lot, so I've done some short term optimizations like restricting the number of Apache threads to limit the MySQL calls. I also set up WP Super Cache to serve static content. Soon I'm going to offload all of my images to S3 or CloudFront. So this leads me to my question. If this doesn't seem to help, and if i have another traffic "spike", how do AMIs work when you have a MySQL database? I think I understand that if you have more than one instance and assign the same Elastic IP to both of them, the incoming traffic gets distributed among both. But what happens when the MySQL database gets updated on one of the instances? I just need to wrap my mind around what happens when I create an AMI and then launch a new instance to help with traffic. Thanks for your suggestions.

    Read the article

  • Enlarging everything on 16" 1920x1080 notebook display in Windows 7

    - by Rob
    Does Windows 7 have an option to enlarge everything on the screen, e.g. via a dpi setting? How well does this work? i.e. do objects look clear when enlarged? I ask this because software that enlarges non-photographic bitmap images, e.g. icons and symbols, can sometimes leave them artifacted, with harsh jagged slopes and blurred lines. I've tried the dpi setting in Windows XP, but it doesn't enlarge everything, and some things are not so clear, as described above. I'm looking at a notebook/laptop with this spec, and I've already enjoyed using a 15.4" 1920x1200 display for 5 years. I am buying the laptop for my father, who will probably prefer larger objects on the screen, although I want to provide some future-proofing by allowing more on the screen if needed. I'm not interested in answers that debate the effectiveness or otherwise merits of 1920x1080 on a 16" display, please. The alternative option of 1366x768 seems too little.

    Read the article

  • Attempting to emulate Apache MultiViews with Nginx try_files

    - by Samuel Bierwagen
    I want a request to http://example.com/foobar to return http://example.com/foobar.jpg. (Or .gif, .html, .whatever.) This is trivial to do with Apache MultiViews, and it seems like it would be equally easy in Nginx. This question seems to imply that it'd be as easy as try_files $uri $uri/ index.php; in the location block, but that doesn't work. try_files $uri $uri/ =404; doesn't work, nor does try_files $uri =404; or try_files $uri.* =404; Moving it between my location / { block and the regexp which matches images has no effect. Crucially, try_files $uri.jpg =404; does work, but only for .jpg files, and it throws a configuration error if I use more than one try_files rule in a location block! The current server { block:

        server {
            listen 80;
            server_name example.org www.example.org;
            access_log /var/log/nginx/vhosts.access.log;
            root /srv/www/vhosts/example;

            location / {
                root /srv/www/vhosts/example;
            }

            location ~* \.(?:ico|css|js|gif|jpe?g|es|png)$ {
                expires max;
                add_header Cache-Control public;
                try_files $uri =404;
            }
        }

    Nginx version is 1.1.14.
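
    A sketch of why this behaves as observed, untested: try_files does not glob, which is why $uri.* never matches, and nginx allows only one try_files per location, which explains the configuration error. But a single try_files accepts any number of candidates, so the extensions can simply be listed explicitly:

        location / {
            root /srv/www/vhosts/example;
            # candidates are checked left to right; the last entry is the fallback
            try_files $uri $uri.jpg $uri.gif $uri.png $uri.html $uri/ =404;
        }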

    Read the article

  • Change default profile directory per group

    - by Joel Coel
    Is it possible to force windows to create profiles for members of one active directory group in a different folder from members in another active directory group? The school here uses DeepFreeze to protect public computers. In a nutshell, DeepFreeze prevents all changes to a hard drive such that every time you restart the machine the disk is identical to it was at the time you froze it. This is a bit different than restoring to an image, in that it never really wrote changes to disk in a permanent way in the first place. This has a few advantages over images: faster recover times, and it's easy to thaw the machine for a few minutes to perform maintenance such as windows updates (which can even be automated). DeepFreeze also allows you to configure a "thawspace" partition, where changes are persistent across reboots. One of the weaknesses of DeepFreeze is that you end up needing to create a new profile every time you log in, unless your profile existed at the time the machine was frozen. And even then, any changes you make to your profile while working on a frozen machine are lost. As students have frequent legitimate needs to log in to our classroom machines, there is currently a lot of cleanup involved from time to time in removing their old profiles and changes, so I want to extend DeepFreeze to protect our classroom computers as well as public computers. The problem is that faculty have a real need to keep a stateful profile locally on these classroom computers. The solution I would like to use is to configure Windows via group policy (or even manually, if that's the way I'll have to do it) to place profile folders on the thawspace partition, but only for members of the faculty security group. Is this possible?

    Read the article

  • Windows 7 System File Checker (sfc) not working

    - by Andrew
    I have a strange problem with Windows 7 RTM. I'm running Ultimate 64-bit edition. Whenever I run sfc /scannow I get this error: Windows Resource Protection could not start the repair service. My research so far has told me that I need to set the Windows Module Installer and Windows Installer services' startup types to Manual. They already were, so I manually started those services and tried again. No luck. I've even booted into my Windows 7 repair disc and tried running sfc /scannow from that. All I get is this: A repair operation is already pending. Restart and try again. History: I'm trying to run sfc because I am unable to open any images in Windows Photo Viewer. Whenever I try, I get the error "Class not registered." I believe the problem started after I installed Gladinet, but I can't be sure. I've uninstalled Gladinet, but the problem remains. System Restore was disabled (yes, I know I'm stupid - you don't have to remind me). Please help. Thanks.
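
    "A repair operation is already pending" suggests queued servicing actions that never completed are blocking both the online and offline runs. A sketch from the repair disc's command prompt, assuming the installation is on C:; DISM can revert the pending actions, after which sfc can be run against the offline image:

        rem Revert the stuck pending servicing actions in the offline image
        dism /image:C:\ /cleanup-image /revertpendingactions

        rem Then run the file checker against the offline installation
        sfc /scannow /offbootdir=C:\ /offwindir=C:\Windows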

    Read the article

  • Dell laptop keyboard doesn't work

    - by Tam
    I'm trying to fix my in-laws' laptop, a Dell Studio 1745 running Windows 7 64-bit. The problem is that most of the keys on the keyboard do not work. The function keys work, and the caps lock and numpad keys work, but no other keys do. If I hit the F2 key enough times when starting up, I can get to the BIOS, but after that even the function keys stop working. If I let it go all the way to the Windows login screen, I can see that the caps lock and num lock keys register - little images actually appear on screen - but they don't toggle the state of the key, i.e. caps lock is always off, num lock is always off. Using the fn+function combos works, so changing the brightness, etc. works fine. I'm stumped. I've tried disconnecting power and battery and leaving it for an hour or so before starting up, but that hasn't helped either. Also - this might be a red herring - the touchpad is failing as well; MS Device Manager says that it's failing with status 10, "unable to start device".

    Read the article

  • "What happens?" server performance monitor

    - by AlexAtNet
    Hello! After reviewing some threads about server monitoring software, I end up with a simple question: which of the server monitoring tools should I use for automatic detection of "abnormal" situations, with recommendations on how to fix them? I'm looking for software that checks the system performance after installation and calculates some average load values (memory, CPU, etc.), and when something happens (CPU load increases to 20%) tries to detect the reason for it. If it is Apache, it should check the access logs. If it is MySQL, it should check the MySQL logs and tell me what happened. If it is because some user is decoding a lot of images, I'd like to know which command was executed, when, and under which user name. The same for disk usage, memory, number of processes, threads and so on. Ideally, this software should periodically check the system and report problems: errors in the PHP error log, outdated packages, security vulnerabilities. In other words, I'm looking for software that will keep my simple Debian/Apache/PHP/MySQL server running without forcing me to monitor the charts every day. I hope that such a program exists. Thanks, Alex

    Read the article

  • What's the best way to clone multiple PCs from one machine?

    - by Jason T.
    Where I work we have dozens and dozens of old ThinkPad laptops. A lot of these can be reused, but not for our needs; they have long since been replaced. The higher-ups have decided to donate them to charity, and for better or for worse I have been tasked with reimaging them. I took a laptop, installed the factory copy of Windows, updated it, and configured it appropriately. Now I'm trying to reimage it to dozens of other laptops. What's some good software to do this? First I used Clonezilla to clone the laptop's hdd to an internal drive in an external enclosure, and it worked. Then I tried taking the base image out and connecting it externally to a laptop that needed to be imaged, and I got it to work a few times. So far so good, right? Well, once I informed my boss of my findings and what I wanted to do, the images started to not work on new laptops. One of three things would happen: the ThinkPads would just blink at me and Windows wouldn't load; or Windows would load but freeze within two minutes; or, last but not least, the laptops would BSOD during the Windows XP bootup. These laptops are not going to be used by the company. They're going to charity. So can anyone else recommend a way to reimage multiple laptops?

    Read the article

  • Restrict Computer or Users from Internet but allow access to intranet and Windows Update / ePO?

    - by MoSiAc
    So this may be impossible, but I've been asked to try and find something about it, and so far nothing I have found is possible. I need to restrict specific machines or user accounts from regular Internet access but let them have access to the intranet portion of our network. I do not have Active Directory control, nor does anyone at my local workplace (corporate control is in a different state). I have tried going through IPsec and doing this per local machine, but that system seems to have been removed from the images that are installed on these machines, so that is out. So far the only other option I can think of is assigning the machines a specific IP address and removing their gateway access. This would probably work, but the machines need to be able to receive updates that are being pushed to them through ePO and LanDesk. I would really like to do this on the user level, because then if I need to do tech work on the machine and need Internet access I can get to it, but a "special" user could log in and not be able to get into anything.
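
    The no-default-gateway idea can be made to work even if the ePO/LanDesk servers sit on other subnets, by adding persistent static routes just for them; the local intranet subnet is reachable without any gateway. A sketch with hypothetical addresses (run as administrator on each machine):

        rem Static address with subnet mask but no default gateway
        netsh interface ip set address "Local Area Connection" static 10.1.2.50 255.255.255.0

        rem Persistent host routes to the ePO / LanDesk servers only
        route -p add 10.9.8.7 mask 255.255.255.255 10.1.2.1
        route -p add 10.9.8.8 mask 255.255.255.255 10.1.2.1

    This is per-machine rather than per-user, though; user-level filtering without Active Directory or local firewall policy control is much harder to arrange.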

    Read the article

  • Why does dd finish instantly when piping to cat?

    - by agsamek
    I start bash on Cygwin and type:

        dd if=/dev/zero | cat /dev/null

    It finishes instantly. When I type:

        dd if=/dev/zero > /dev/null

    it runs as expected, and I can issue killall -USR1 dd to see the progress. Why does the former invocation finish instantly? Is it the same on a Linux box? * Explanation of why I asked such a stupid question, and a possibly not-so-stupid question: I was compressing hdd images and some were compressed incorrectly. I ended up with the following script showing the problem:

        while sleep 1 ; do killall -v -USR1 dd ; done &
        dd if=/dev/zero bs=5000000 count=200 | gzip -c | gzip -cd | wc -c

    wc should write 1000000000 at the end. The problem is that it does not on my machine:

        bash-3.2$ dd if=/dev/zero bs=5000000 count=200 | gzip -c | gzip -cd | wc -c
        13+0 records in
        12+0 records out
        60000000 bytes (60 MB) copied, 0.834 s, 71.9 MB/s
        27+0 records in
        26+0 records out
        130000000 bytes (130 MB) copied, 1.822 s, 71.4 MB/s
        200+0 records in
        200+0 records out
        1000000000 bytes (1.0 GB) copied, 13.231 s, 75.6 MB/s
        1005856128

    Is it a bug, or am I doing something wrong once again?
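
    On the first question at least, the behaviour is the same on Linux as on Cygwin (an explanatory sketch): cat treats /dev/null as a file operand, so it never reads its standard input at all; it copies the empty file and exits, the read end of the pipe closes, and dd is killed by SIGPIPE on its next write.

        # cat reads the named file and ignores the pipe; dd dies of SIGPIPE
        dd if=/dev/zero | cat /dev/null

        # with no operand (or '-'), cat reads stdin, so this runs until killed
        dd if=/dev/zero | cat > /dev/null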

    Read the article
