Search Results

Search found 82005 results on 3281 pages for 'cost based data structure'.


  • Command-line HTTP crawler for Windows?

    - by Pekka
    Would somebody have a recommendation for a web site crawler that can be invoked and equipped with settings from the command line? This would need to run in a Windows environment. Saving the data, following stylesheet links etc. is not an issue. I only need the crawler to start with a page, parse it, and follow all the links on the same domain so that in the end, all pages on the site have been requested once. Background: I'm setting up a web site that gets frequently uploaded from an office location. Combining data from various sources, it has several levels of caching. I don't want the first user to visit the site after a fresh upload to have to wait until the page has been generated and saved in the cache.
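    One possibility, sketched here on the assumption that GNU Wget (which has Windows builds) would be acceptable: a recursive crawl that stays on the domain and throws away what it downloads, so every linked page gets requested once and the caches are warmed. example.com stands in for the real site.

        # hypothetical cache-warming crawl; stays on example.com, requests every
        # linked page once, and deletes the downloaded copies afterwards
        wget --recursive --level=inf --no-parent --domains=example.com --delete-after http://example.com/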

    Read the article

  • Would a Socket Connection Outperform an Interval-Based Database Sweep and Requests?

    - by Jascha
    I'm building a small chat application to add to an existing framework. There will only be 20-50 users max at any one time. I was wondering if I could get away with updating a cache file containing (semi-)live chat data for whichever users happen to be chatting, just by running timed queries and regular AJAX refreshes for new data, as opposed to learning how to open and maintain a socket connection. I'm sure there are existing chat plug-ins out there, but I just had a hell of a time installing one, and I could see building the whole damn thing taking just as much time as plugging one in. Am I off to a bad start? Thanks in advance -J (P.S. This is a semi-closed network behind a PHP login, so security isn't a great concern.)
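    At this scale the polling approach usually comes down to one cheap query per refresh. A rough sketch, assuming a hypothetical chat_messages table with an auto-increment id; the client remembers the last id it saw and only asks for newer rows:

        -- hypothetical schema and column names; an index on (room_id, id) keeps
        -- each poll to a quick range scan even with frequent AJAX refreshes
        SELECT id, user_id, body, created_at
        FROM chat_messages
        WHERE room_id = ? AND id > ?
        ORDER BY id
        LIMIT 100;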

    Read the article

  • Need some help with Apache .htaccess

    - by Legend
    I am trying to set up an application that was built using the Zend Framework. Let's say my subdomain is http://subdomain.domain.com and that it points to http://www.domain.com/projectdir/ The structure of the project dir is the following:
        application/
            ... ...
        library/
            ... ...
        public/
            ... ...
            .htaccess
    The contents of the .htaccess are:
        SetEnv APPLICATION_ENV production
        RewriteEngine On
        # skip existing files and folders
        RewriteCond %{REQUEST_FILENAME} -s [OR]
        RewriteCond %{REQUEST_FILENAME} -l [OR]
        RewriteCond %{REQUEST_FILENAME} -d
        RewriteRule ^.*$ - [NC,L]
        # send everything to index
        RewriteRule ^.*$ index.php [NC,L]
    While this works, the child objects on the page are being directed to the domain, i.e. the image URLs (and the CSS files etc.) are broken because they are being redirected to something like http://www.domain.com/images/image.png Can someone please tell me how to fix this?
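    One way this is often handled, sketched here without testing against this particular setup: make requests for /css, /js, /images and so on resolve inside public/ as well, for example with an extra .htaccess at the top of projectdir/ that forwards everything into public/. The file location and rules below are assumptions about this layout, not taken from the question:

        # hypothetical projectdir/.htaccess, assuming subdomain.domain.com is
        # rooted at projectdir/; forwards all requests into public/
        RewriteEngine On
        RewriteCond %{REQUEST_URI} !^/public/
        RewriteRule ^(.*)$ public/$1 [L]

    The cleaner variant, where the hosting allows it, is to point the subdomain's document root directly at projectdir/public/ so asset paths like /images/image.png resolve without any forwarding.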

    Read the article

  • DFS-R + VSS - Run VSS at all locations or one?

    - by pbarranis
    I have 4 servers, one at each office location, each sharing read/write copies of 1.5TB of data. Changes are replicated 24/7 between all 4 servers via DFS-R. All servers run Windows 2008 R2. I'm interested in implementing VSS (Volume Shadow Services) on this data. I've read that DFS-R and VSS play nicely together, but I'm left with one unanswered question: turn on VSS at all locations or just one (the headquarters). Can I run VSS at all 4 locations safely, or is it wiser to run it at just 1 location? Thanks!
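    For reference, shadow copies are read-only, point-in-time views of the volume, so they are not themselves replicated by DFS-R; enabling them on all four members is mostly a question of disk space and snapshot I/O at each site. A sketch of the per-server setup (D: is a placeholder for the replicated data volume, and the 10% cap is illustrative):

        :: rough sketch, run on each member you want snapshots on
        vssadmin add shadowstorage /for=D: /on=D: /maxsize=10%
        vssadmin create shadow /for=D:
        :: or schedule regular snapshots via Task Scheduler with:
        wmic shadowcopy call create Volume='D:\'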

    Read the article

  • Log rotation with automatic *.log file discovery

    - by Mikko Ohtamaa
    I am hosting several websites, each of which runs its own Python process and writes *.log output files, but the directory structure is not standardized. Example:
        -rw-r--r-- 1 plone plone 125M 2012-08-29 11:35 ./x/var/log/instance-Z2.log
        -rw-r--r-- 1 plone plone  19M 2012-08-29 00:07 ./zope2.9/y/log/event.log
        -rw-r--r-- 1 plone plone 188M 2012-08-13 00:09 ./zope2.9/y/log/Z2.log
        -rw-r--r-- 1 plone plone 137M 2010-11-16 09:41 ./zope2.9/y/log/event.log
    I'd like logrotate to auto-discover these log files and rotate them, as opposed to manually typing every log file into the logrotate conf. Do any existing tools offer this kind of log file discovery and rotation, without specifying each file manually? If not... should I just write a shell script that generates the logrotate conf?
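    Worth noting that logrotate itself accepts shell globs in the file specification, so a single stanza can cover every matching file without listing them one by one. A rough sketch, assuming the sites live under one parent directory; the paths and options below are illustrative, not taken from this setup:

        # hypothetical /etc/logrotate.d/python-sites
        /srv/sites/*/var/log/*.log /srv/sites/*/log/*.log {
            weekly
            rotate 8
            compress
            delaycompress
            missingok
            notifempty
            copytruncate
        }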

    Read the article

  • Linux - Create ftp account with read/write access to only 1 folder

    - by Gublooo
    Hey guys.... I have never worked on Linux and don't plan on working on it either - the only command I probably know is "ls" :) I am hosting my website on Eapps and use their cPanel to set everything up, so I have never worked with Linux directly. Now I have this one-time case where I need to give a contractor access so he can fix the CSS issues on my website. He basically needs FTP (read/write) access to certain folders. At a high level, this is my code structure:
        /home/webadmin/example.com/html/images
            /css
            /js
            /login.php
            /facebook.php
        /home/webadmin/example.com/application/library
            /views
            /models
            /controllers
            /config
            /bootstrap.php
        /home/webadmin/example.com/cgi-bin
    I want the new user to have access to only these folders:
        /home/webadmin/example.com/html/js
        /home/webadmin/example.com/html/css
        /home/webadmin/example.com/application/views
    He should not be able to even see the contents of the other folders, including files like bootstrap.php or login.php. If any sysadmins can help me set this account up, I will really appreciate it. Thanks
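    One common way to do this, sketched with hypothetical user and path names (your host's cPanel may offer an equivalent "add FTP account" wizard that is safer than shell work): create a dedicated user whose FTP session is jailed to its own home directory, and bind-mount just the three folders into it, so nothing else is even visible.

        # hypothetical commands, run as root; 'cssdev' is a made-up username
        useradd -m -d /home/cssdev -s /usr/sbin/nologin cssdev
        passwd cssdev
        mkdir -p /home/cssdev/css /home/cssdev/js /home/cssdev/views
        mount --bind /home/webadmin/example.com/html/css          /home/cssdev/css
        mount --bind /home/webadmin/example.com/html/js           /home/cssdev/js
        mount --bind /home/webadmin/example.com/application/views /home/cssdev/views
        # with vsftpd, jail the user to their home directory, e.g. in /etc/vsftpd.conf:
        #   local_enable=YES  write_enable=YES  chroot_local_user=YES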

    Read the article

  • The ping response time doesn't reflect the real network response time

    - by yangchenyun
    I have encountered a weird problem: the response time returned by ping is almost fixed at 98 ms. Whether I ping the gateway, a local host, or an internet host, the response time is always around 98 ms, even though the actual delay obviously differs between those targets. However, the reverse ping (from a local machine to this host) works properly. The following is my route table and the results:
        route -n
        Kernel IP routing table
        Destination     Gateway         Genmask         Flags Metric Ref    Use Iface
        0.0.0.0         192.168.1.1     0.0.0.0         UG    100    0        0 eth1
        60.194.136.0    0.0.0.0         255.255.255.0   U     0      0        0 eth0
        169.254.0.0     0.0.0.0         255.255.0.0     U     1000   0        0 eth1
        192.168.1.0     0.0.0.0         255.255.255.0   U     0      0        0 eth1

        # ping the gateway
        ping 192.168.1.1
        PING 192.168.1.1 (192.168.1.1) 56(84) bytes of data.
        64 bytes from 192.168.1.1: icmp_req=1 ttl=64 time=98.7 ms
        64 bytes from 192.168.1.1: icmp_req=2 ttl=64 time=97.0 ms
        64 bytes from 192.168.1.1: icmp_req=3 ttl=64 time=96.0 ms
        64 bytes from 192.168.1.1: icmp_req=4 ttl=64 time=94.9 ms
        64 bytes from 192.168.1.1: icmp_req=5 ttl=64 time=94.0 ms
        ^C
        --- 192.168.1.1 ping statistics ---
        5 packets transmitted, 5 received, 0% packet loss, time 4004ms
        rtt min/avg/max/mdev = 94.030/96.149/98.744/1.673 ms

        # ping a local machine
        ping 192.168.1.88
        PING 192.168.1.88 (192.168.1.88) 56(84) bytes of data.
        64 bytes from 192.168.1.88: icmp_req=1 ttl=64 time=98.7 ms
        64 bytes from 192.168.1.88: icmp_req=2 ttl=64 time=96.9 ms
        64 bytes from 192.168.1.88: icmp_req=3 ttl=64 time=96.0 ms
        64 bytes from 192.168.1.88: icmp_req=4 ttl=64 time=95.0 ms
        ^C
        --- 192.168.1.88 ping statistics ---
        4 packets transmitted, 4 received, 0% packet loss, time 3003ms
        rtt min/avg/max/mdev = 95.003/96.696/98.786/1.428 ms

        # ping an internet host
        ping google.com
        PING google.com (74.125.128.139) 56(84) bytes of data.
        64 bytes from hg-in-f139.1e100.net (74.125.128.139): icmp_req=1 ttl=42 time=99.8 ms
        64 bytes from hg-in-f139.1e100.net (74.125.128.139): icmp_req=2 ttl=42 time=99.9 ms
        64 bytes from hg-in-f139.1e100.net (74.125.128.139): icmp_req=3 ttl=42 time=99.9 ms
        64 bytes from hg-in-f139.1e100.net (74.125.128.139): icmp_req=4 ttl=42 time=99.9 ms
        ^C64 bytes from hg-in-f139.1e100.net (74.125.128.139): icmp_req=5 ttl=42 time=99.9 ms
        --- google.com ping statistics ---
        5 packets transmitted, 5 received, 0% packet loss, time 32799ms
        rtt min/avg/max/mdev = 99.862/99.925/99.944/0.284 ms
    I am running iperf to test the bandwidth, and the rate is quite low for a LAN connection:
        iperf -c 192.168.1.87 -t 50 -i 10 -f M
        ------------------------------------------------------------
        Client connecting to 192.168.1.87, TCP port 5001
        TCP window size: 0.06 MByte (default)
        ------------------------------------------------------------
        [  4] local 192.168.1.139 port 54697 connected with 192.168.1.87 port 5001
        [ ID] Interval       Transfer     Bandwidth
        [  4]  0.0-10.0 sec  6.12 MBytes  0.61 MBytes/sec
        [  4] 10.0-20.0 sec  6.38 MBytes  0.64 MBytes/sec
        [  4] 20.0-30.0 sec  6.38 MBytes  0.64 MBytes/sec
        [  4] 30.0-40.0 sec  6.25 MBytes  0.62 MBytes/sec
        [  4] 40.0-50.0 sec  6.38 MBytes  0.64 MBytes/sec
        [  4]  0.0-50.1 sec  31.6 MBytes  0.63 MBytes/sec

    Read the article

  • Close format tags

    - by ZyX
    I have HTML with the following structure:
        <html>
        <...>
        <div> Something </div>
        <...>
        Text
        </html>
    If «Something» contains an unclosed <b> tag, «Text» also becomes bold. I want to close this tag using [User]CSS, [User]JS, HTML or a fixed number of PCRE regexes. I do not know how many unclosed tags «Something» contains, but the div's around «Something» are added by me using Privoxy.

    Read the article

  • MSDCS zone missing

    - by hyp
    I seem to have an issue with our AD/DNS. The zone structure looks like the attached screenshot (omitted here), but the BPA gives me an error:
        Issue: The Active Directory integrated DNS zone _msdcs.<domain> was not found.
        Impact: DNS queries for the Active Directory integrated zone _msdcs.<domain> might fail.
        Resolution: Restore the Active Directory integrated DNS zone _msdcs.<domain>
    Now we've got 4 DCs in total: 2 running Server 2008 R2 and 2 running Server 2003; the older ones will be retired sometime this year. Actually, everything seems to be working OK (if something isn't, we don't know about it): we've got quite a few .NET applications authenticating against AD, no DNS issues from what I can tell, and various bits on the network point to all 4 controllers. Furthermore, a dcdiag /dnsall comes up with all passes. Is this something I should be worried about?
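    A couple of checks that are commonly suggested here, sketched with the same <domain> placeholder (the server name in the commented command is hypothetical): confirm whether the _msdcs SRV records still resolve, list the zones the DNS servers actually host, and only consider recreating the zone if it is genuinely missing.

        nslookup -type=SRV _ldap._tcp.dc._msdcs.<domain>
        dnscmd /enumzones
        REM recreate only if the zone really is gone (example syntax, adjust names):
        REM dnscmd <dns-server> /zoneadd _msdcs.<domain> /dsprimary /dp /forest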

    Read the article

  • Write to windows share

    - by aidan
    I used to mount a windows share in Ubuntu server, with an entry in fstab: //data/SharedFolder /media/SharedFolder/ smbfs user,defaults,credentials=/root/.creds,uid=root,gid=root 0 0 /root/.creds is a text file with three lines, my username, password and domain. Users on the ubuntu server could write to this mount, but then I upgraded to 10.04 and now only root can write. Regular users can still read though. mount currently tells me: //data/SharedFolder on /media/SharedFolder type cifs (rw,mand,noexec,nosuid,nodev) How do I make it world writeable again? Thanks
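    For what it's worth, a sketch of how the fstab line is often adjusted after the move from smbfs to cifs; with cifs, the uid/gid/file_mode/dir_mode options control who may write on the client side. The group and mode values here are placeholders, not taken from this setup:

        # hypothetical replacement entry using cifs explicitly
        //data/SharedFolder  /media/SharedFolder  cifs  user,credentials=/root/.creds,uid=root,gid=users,file_mode=0775,dir_mode=0775  0 0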

    Read the article

  • firefox sync not working. tried everything I could think of.

    - by RasmusWriedtLarsen
    I had to format my computer once again. But this time, when trying to get all my Firefox data from the Sync server, nothing happened :( I've tried updating Firefox and reconnecting to the Sync server a couple of times, but as the title says, nothing has happened. I fear that somehow all my data has been wiped, but I have no clue why it would have done that. I think my old FF was running v4b7, but I can't say for sure. Please help! I also tried posting on the Firefox forums, but I must say they don't work very well - it doesn't seem like my question has even been posted.

    Read the article

  • aptitude: list all previous recommended packages

    - by casper
    Sometimes when installing a package, aptitude recommends several other packages. Is there a way to show all previously recommended packages of all installed packages? Thanks in advance. Casper
    Edit: Thanks for the replies so far. I already tried:
        aptitude show ~i | grep '^Recommends' | cut -d ' ' -f 2-
    That's mostly OK, but it also gives back things like:
        console-setup | console-data (>= 2002.12.04dbs-1)
    I want an easy way to install all missing recommended packages, so
        aptitude install console-setup | console-data (>= 2002.12.04dbs-1)
    won't work ;-) Is there a way to do this without manually checking all entries?
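    A rough, untested sketch of one way to post-process that output: strip the version constraints and alternative choices, de-duplicate, and hand the remaining names to aptitude in simulate mode first to review what it would pull in (drop -s to actually install):

        # rough sketch (untested)
        aptitude show '~i' | grep '^Recommends' | cut -d' ' -f2- \
          | tr ',' '\n' \
          | sed -e 's/([^)]*)//g' -e 's/|.*//' -e 's/^[[:space:]]*//;s/[[:space:]]*$//' \
          | sort -u \
          | xargs aptitude -s install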

    Read the article

  • Help me exorcise my demon possessed logon script

    - by Detritus Maximus
    I have a user logon script that copies a file over to a subfolder of the current user's profile path. Script (only showing the line that isn't working):
        copy /Y c:\records\javasettings_Windows_x86.xml "%USERPROFILE%\Application Data\OpenOffice.org\3\user\config">>c:\records\OOo3%USERNAME%.txt 2>&1
    To diagnose why it wasn't working, I added a somelogfile.log parameter to the group policy script and found that the above command translates to this:
        C:\WINDOWS>copy /Y c:\records\javasettings_Windows_x86.xml "C:\Documents and Settings\test2\Application Data\OpenOffice.org\3\user\config" 1>>c:\records\OOo3test2.txt 2>&1
    So the question is, how do I get rid of (exorcise) the " 1" in that line?
    Update 1: The reason the script wasn't working was that the creator didn't have any permissions on the directory. I fixed the permissions and now the copy works - but I still have the " 1" showing in all the logs and would like to know why.
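    A hedged note on that " 1": in cmd.exe a plain >> redirect is shorthand for handle 1 (stdout), and when the interpreter echoes or logs a command it tends to write the handle out explicitly, so it is usually cosmetic rather than a problem with the script. A quick sketch to try in a test script:

        @echo on
        rem these two lines behave the same; ">>" defaults to handle 1 (stdout)
        echo hello>>c:\records\test.txt
        echo hello 1>>c:\records\test.txt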

    Read the article

  • How to FTP via CLI? [closed]

    - by Ryan
    So I have a Debian machine and I want to transfer data on here to a Windows based FTP account. I've managed to open up firewalling for outbound FTP on the Debian source, but have never used FTP via CLI before. Does anyone know how I can go about transferring data? I am starting to get really confused with what to transfer from where. If I ssh to the Debian machine and then connect to the Windows FTP account, then try put and get commands, it never seems to recognise the path of the source files on the Debian machine.
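    A minimal interactive session with the stock ftp client, sketched with placeholder names; the usual trap is that lcd changes the local (Debian) directory while cd changes the remote (Windows) one, and remote paths are whatever the Windows FTP server exposes, not Debian paths:

        ftp ftp.example.com          # placeholder hostname; log in when prompted
        ftp> lcd /home/me/exports    # local directory on the Debian box
        ftp> cd /inbound             # remote directory on the Windows FTP server
        ftp> binary                  # binary mode for non-text data
        ftp> put backup.tar.gz       # upload; use get to download
        ftp> bye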

    Read the article

  • Stability, x86 Vs Sparc

    - by Jason T
    Our project plans to migrate from SPARC to x86, and our HA requirement is 99.99%. Previously, on SPARC, we assumed hardware failures roughly every four months, or even only once a year, and we have test data for our application, so we set a budget for each unplanned recovery (failover) that lets us achieve 99.99% (52.6 minutes of unplanned downtime per year). But since we are going to use Intel x86, it seems the hardware stability is not as good as SPARC, although we don't have detailed data. So, compared with SPARC, how stable is Intel x86? Should we assume more unplanned downtime, and if so, how much more - double? Where can I find more details on these two types of hardware?

    Read the article

  • SQL query duplicating results [on hold]

    - by Ben
    I have written a query that retrieves data for the top 5 customers per account manager in my table. Here is the query:
        SELECT account_manager_id, mgap_ska_id, total
        FROM (
            SELECT account_manager_id, mgap_ska_id,
                   mgap_growth + mgap_recovery AS total,
                   @grp_rank := IF(@current_accmanid = account_manager_id, @grp_rank + 1, 1) AS grp_rank,
                   @current_accmanid := account_manager_id
            FROM mgap_orders
            ORDER BY total DESC
        ) ranked
        WHERE grp_rank <= 5
    and here is the result of the query:
        account_manager_id  mgap_ska_id  total
        159840              5062352      61569.21
        159840              5062352      61569.21
        159840              5062352      61569.21
        159840              5062352      61569.21
        159840              5062352      61569.21
        160023              5024546      52244.29
        160023              5024546      52244.29
        160023              5024546      52244.29
        160023              5024546      52244.29
        160023              5024546      52244.29
        159669              5323292      50126.38
        159669              5323292      50126.38
        159669              5323292      50126.38
        159669              5323292      50126.38
        159669              5323292      50126.38
    As you can see, the query is only partially working: I'm getting duplicates of the same mgap_ska_id, whereas it should be five distinct mgap_ska_id values per account manager. Here is a sample of my data:
        mgap_ska_id  account_manager_id  mgap_growth  mgap_recovery
        5057810      64154               0            1160.78
        5178114      24456               0            5773.42
        5292421      160338              0            5146.04
        5414091      24408               0            104.14
        5057810      64154               0            1160.78
    Can anyone see where I've gone wrong in my query, and how/where I might correct the error so I get the top 5 individual customers (mgap_ska_id) instead of the same top customer duplicated?
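    A rough, untested reworking along the lines usually suggested for this pattern: collapse the duplicate order rows per customer first (the sample data shows repeated mgap_ska_id rows), initialise the session variables explicitly, and order by account manager before ranking so the counter resets where it should. Table and column names are taken from the question; everything else is an assumption:

        SELECT account_manager_id, mgap_ska_id, total
        FROM (
            SELECT s.account_manager_id, s.mgap_ska_id, s.total,
                   @grp_rank := IF(@current_accmanid = s.account_manager_id, @grp_rank + 1, 1) AS grp_rank,
                   @current_accmanid := s.account_manager_id
            FROM (
                SELECT account_manager_id, mgap_ska_id,
                       SUM(mgap_growth + mgap_recovery) AS total
                FROM mgap_orders
                GROUP BY account_manager_id, mgap_ska_id
            ) s
            CROSS JOIN (SELECT @grp_rank := 0, @current_accmanid := NULL) vars
            ORDER BY s.account_manager_id, s.total DESC
        ) ranked
        WHERE grp_rank <= 5;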

    Read the article

  • Using wget to download pdf files from a site that requires cookies to be set

    - by matt74tm
    I want to access a newspaper site and then download their epaper copies (in PDF). The site requires me to log in using my email address and password, and then it permits me to access those PDF URLs. I'm having trouble 'setting my session' in wget. When I log into the site from my browser, it sets two cookie values:
        [email protected]
        Password=12345
    I tried:
        wget --post-data "[email protected]&Password=12345" http://epaper.abc.com/login.aspx
    However, that just downloaded the login page and saved it locally. The FORM on the login page has two fields, txtUserID and txtPassword, and radio buttons like this:
        <input id="rbtnManchester" type="radio" checked="checked" name="txtpub" value="44">
    Another button:
        <input id="rbtnLondon" type="radio" name="txtpub" value="64">
    If I post this to the login.aspx page, I get the same output:
        wget --post-data "[email protected]&txtPassword=12345&txtpub=44" http://epaper.abc.com/login.aspx
    If I add --save-cookies abc_cookies.txt, it doesn't seem to have anything other than the default content. For the last one, if I add --debug as well, it says:
        ...
        Set-Cookie: ASP.NET_SessionId=05kphcn4hjmblq45qgnjoe41; path=/; HttpOnly
        ...
        Stored cookie epaper.abc.com -1 (ANY) / <session> <insecure> [expiry none] ASP.NET_SessionId 05kphcn4hjmblq45qgnjoe41
        Length: 107253 (105K) [text/html]
        Saving to: `login.aspx'
        ...
        Saving cookies to abc_cookies.txt.
    However, abc_cookies.txt shows ONLY the following:
        # HTTP cookie file.
        # Generated by Wget on 2011-08-16 08:03:05.
        # Edit at your own risk.
    (Not sure why I'm not getting any responses on SO - perhaps SU is a better forum - http://stackoverflow.com/questions/7064171/using-wget-to-download-pdf-files-from-a-site-that-requires-cookies-to-be-set)
    EDIT 1
        C:\Temp>wget --cookies=on --keep-session-cookies --save-cookies abc_cookies.txt --post-data "txtUserID=abc%40gmail.com&txtPassword=password&txtpub=44&chkbox=checkbox&submit.x=48&submit.y=7" http://epaper.abc.com/login.aspx --debug
        SYSTEM_WGETRC = c:/progra~1/wget/etc/wgetrc
        syswgetrc = C:\Program Files (x86)\GnuWin32/etc/wgetrc
        DEBUG output created by Wget 1.11.4 on Windows-MinGW.
        --2011-08-18 08:15:59--  http://epaper.abc.com/login.aspx
        Resolving epaper.abc.com... seconds 0.00, 999.999.99.99
        Caching epaper.abc.com => 999.999.99.99
        Connecting to epaper.abc.com|999.999.99.99|:80... seconds 0.00, connected.
        Created socket 300.
        Releasing 0x00a2ae80 (new refcount 1).
        ---request begin---
        POST /login.aspx HTTP/1.0
        User-Agent: Wget/1.11.4
        Accept: */*
        Host: epaper.abc.com
        Connection: Keep-Alive
        Content-Type: application/x-www-form-urlencoded
        Content-Length: 100
        ---request end---
        [POST data: txtUserID=abc%40gmail.com&txtPassword=password&txtpub=44&chkbox=checkbox&submit.x=48&submit.y=7]
        HTTP request sent, awaiting response...
        ---response begin---
        HTTP/1.1 200 OK
        Connection: keep-alive
        Date: Thu, 18 Aug 2011 02:46:17 GMT
        Server: Microsoft-IIS/6.0
        X-Powered-By: ASP.NET
        X-AspNet-Version: 2.0.50727
        Set-Cookie: ASP.NET_SessionId=owcrje55yl45kgmhn43gq145; path=/; HttpOnly
        Cache-Control: private
        Content-Type: text/html; charset=utf-8
        Content-Length: 107253
        ---response end---
        200 OK
        Registered socket 300 for persistent reuse.
        Stored cookie epaper.abc.com -1 (ANY) / <session> <insecure> [expiry none] ASP.NET_SessionId owcrje55yl45kgmhn43gq145
        Length: 107253 (105K) [text/html]
        Saving to: `login.aspx.1'
        100%[=========================================>] 107,253    24.9K/s   in 4.2s
        2011-08-18 08:16:05 (24.9 KB/s) - `login.aspx.1' saved [107253/107253]
        Saving cookies to abc_cookies.txt.
        Done saving cookies.

        C:\Temp>wget --referer=http://epaper.abc.com/login.aspx --cookies=on --load-cookies abc_cookies.txt --keep-session-cookies --save-cookies abc_cookies.txt http://epaper.abc.com/PagePrint/16_08_2011_001.pdf --debug
        SYSTEM_WGETRC = c:/progra~1/wget/etc/wgetrc
        syswgetrc = C:\Program Files (x86)\GnuWin32/etc/wgetrc
        DEBUG output created by Wget 1.11.4 on Windows-MinGW.
        Stored cookie epaper.abc.com -1 (ANY) / <session> <insecure> [expiry none] ASP.NET_SessionId owcrje55yl45kgmhn43gq145
        --2011-08-18 08:16:12--  http://epaper.abc.com/PagePrint/16_08_2011_001.pdf
        Resolving epaper.abc.com... seconds 0.00, 999.999.99.99
        Caching epaper.abc.com => 999.999.99.99
        Connecting to epaper.abc.com|999.999.99.99|:80... seconds 0.00, connected.
        Created socket 300.
        Releasing 0x00598290 (new refcount 1).
        ---request begin---
        GET /PagePrint/16_08_2011_001.pdf HTTP/1.0
        Referer: http://epaper.abc.com/login.aspx
        User-Agent: Wget/1.11.4
        Accept: */*
        Host: epaper.abc.com
        Connection: Keep-Alive
        Cookie: ASP.NET_SessionId=owcrje55yl45kgmhn43gq145
        ---request end---
        HTTP request sent, awaiting response...
        ---response begin---
        HTTP/1.1 200 OK
        Connection: keep-alive
        Date: Thu, 18 Aug 2011 02:46:30 GMT
        Server: Microsoft-IIS/6.0
        X-Powered-By: ASP.NET
        X-AspNet-Version: 2.0.50727
        content-disposition: attachement; filename=Default_logo.gif
        Cache-Control: private
        Content-Type: image/GIF
        Content-Length: 4568
        ---response end---
        200 OK
        Registered socket 300 for persistent reuse.
        Length: 4568 (4.5K) [image/GIF]
        Saving to: `16_08_2011_001.pdf'
        100%[=========================================>] 4,568      7.74K/s   in 0.6s
        2011-08-18 08:16:14 (7.74 KB/s) - `16_08_2011_001.pdf' saved [4568/4568]
        Saving cookies to abc_cookies.txt.
        Done saving cookies.
    Contents of abc_cookies.txt:
        epaper.abc.com  FALSE   /       FALSE   0       ASP.NET_SessionId       owcrje55yl45kgmhn43gq145
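    For what it's worth, the debug output shows the "PDF" request coming back as Default_logo.gif, which usually means the login never actually succeeded. A sketch of the approach that often gets ASP.NET logins working with wget (untested against this site; the hidden-field names are the standard ASP.NET ones and are not confirmed from the page): fetch the login form first, copy its __VIEWSTATE and __EVENTVALIDATION values into the POST, and keep reusing the same cookie jar.

        # 1. fetch the login form and start a cookie jar
        wget --keep-session-cookies --save-cookies abc_cookies.txt -O login_form.html http://epaper.abc.com/login.aspx
        # 2. copy the __VIEWSTATE / __EVENTVALIDATION values out of login_form.html,
        #    then post the full form with the same jar (values elided here)
        wget --load-cookies abc_cookies.txt --keep-session-cookies --save-cookies abc_cookies.txt \
             --post-data "__VIEWSTATE=...&__EVENTVALIDATION=...&txtUserID=...&txtPassword=...&txtpub=44" \
             -O after_login.html http://epaper.abc.com/login.aspx
        # 3. only then request the PDF with the authenticated cookies
        wget --load-cookies abc_cookies.txt http://epaper.abc.com/PagePrint/16_08_2011_001.pdf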

    Read the article

  • Could I centralize batch files more efficiently?

    - by PeanutsMonkey
    I am new to the world of batch scripting, so please forgive what may appear to be basic questions. I am learning as I get assigned different jobs, and I am a huge proponent of automation where possible. I have several batch files that perform several tasks. Each of these files had its paths hard-coded, e.g. c:\temp, d:\data, etc., in the batch file. Initially I moved these to a text file I could call from a batch file, e.g.
        for /f "tokens=1,2 delims==" %%R in (config.txt) do (
            if %%R==bdata set bdata=%%S
            if %%R==cdata set cdata=%%S
        )
    The config.txt file contains these values:
        bdata=c:\temp
        cdata=d:\data
    I realized that each time I needed to create a new variable, I would have to update the config.txt file as well as the config.bat file, so I decided to move all the values to just the config.bat file as follows:
        set bdata=c:\temp
        set cdata=d:\data
    I then updated each of the existing batch files to use the variables rather than the hard-coded paths. I also added the following lines of code to each batch file except config.bat (the only additional line added to config.bat is @echo off):
        @echo off
        setlocal enableextensions enabledelayedexpansion
        call config.bat
    I then have another batch file, start.bat, that centralizes calling all the batch files in sequence. The reason I am using start /wait is that there have been instances where delete.bat ran before compress.bat had a chance to finish:
        start /wait compress.bat
        start /wait validate.bat
        start /wait delete.bat
    Questions:
    1. Is this the best way to centralize values and, if not, what is a better way?
    2. Do I need to specify setlocal enableextensions enabledelayedexpansion in all the existing batch files?
    3. Do all the batch files have to have @echo off, or is it sufficient for just the config.bat file?
    4. Is start /wait the best way to call multiple files? Can I pass values from one batch file to another using that command?
    5. All the batch files have different functions, e.g. move, delete, etc., but they all use %%a or %%b. Is this okay? For example, the validate.bat file has the code
        for %%a in (%bdata%\*.*) do if "%%~xa" == "" move /Y "%bdata%\%%~xa" "%bdata%\%done%"
       and the delete.bat file has the code
        for %%a in (%bdata%\*.*) do if "%%~xa" == ".txt" del "%%a"
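    On the start /wait question, a hedged sketch: call also runs each script to completion before moving on, stays in the same console, and, unlike start, keeps variables and arguments visible between the scripts, so a start.bat along these lines (illustrative only) is a common alternative:

        @echo off
        setlocal enableextensions enabledelayedexpansion
        call config.bat
        rem each script runs to completion before the next starts, in this window;
        rem arguments passed here are available to the child script as %1, %2, ...
        call compress.bat "%bdata%"
        call validate.bat "%bdata%"
        call delete.bat "%bdata%"
        endlocal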

    Read the article

  • How would I put together a site requiring several TB? [closed]

    - by acidzombie24
    Let's say I have a site with unmetered 100MBPS bandwidth (I assume that's bits?) and the RAM I require. Most plans I see offer HDDs that hold 250 GB or 1 TB. But what happens if I compile/generate enough data that I require 10 TB or 25 TB? (I'd likely have two servers, but...) I wouldn't be serving all of that data (well, not to the public), so a CDN wouldn't make sense. What do I do in this scenario? Do I need to get a custom plan from a hosting provider? (If so, how do I find them?) Are there services that allow me to mount remote drives (that sounds wrong unless it's a CDN, so maybe not)? Are there hosts that deal specifically in unmetered bandwidth and provide lots of disk space? Math says ~1 TB is the most I'll ever need, but if I happen to need more I'd like to know my options.

    Read the article

  • Increase backup speed, Backup Exec 2010 - QNAP TS419U+ NAS

    - by user99912
    We have a QNAP NAS and the network shares are being backed up by Backup Exec 2010 over SMB. We can't install the remote agent on the NAS as it has an ARM processor and, as far as I am aware, there is no compatible agent. Do you have any suggestions on any faster method of backing up these shares as opposed to the current scenario? Currently the network bandwidth is not the issue, it seems that this access method is just not able to go any quicker. We've also added the NAS shares to the start of the selection list, but we're still running into 18 hours total backup time (total amount of data on the NAS is roughly 650GB). Any comments and/or suggestions welcome. EDIT: Data is being pulled from the NAS by Backup Exec to a LTO4 tape drive
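    One workaround that is sometimes used when an agent can't be installed (a sketch only - the share name and staging path below are made up): keep a local staging copy of the NAS shares on the media server, refreshed incrementally outside the backup window, and point the LTO4 job at that local copy so the tape drive streams at full speed.

        :: hypothetical staging pull, run on the Backup Exec media server
        robocopy \\qnap-nas\Public D:\BEStaging\Public /MIR /R:1 /W:1 /LOG:D:\BEStaging\public_sync.log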

    Read the article

  • Can Excel sorts be saved and used again?

    - by Robert Kerr
    On an Excel 2007 worksheet, I have several tables, each sharing the same columns. For every table, I sort in several particular ways, depending on the task at hand. It gets tedious going to the Data tab, clicking Sort, unchecking the "my data has headers" checkbox, then adding/removing the columns and ordering the sort criteria. Is it possible to:
    * Save a given set of sort criteria (a named sort)?
    * Apply the sort against any selected range?
    * Create a button to execute each saved sort?
    In the end, I would create 4 or 5 named sorts and a button for each on the worksheet. Then I would be able to select any range of rows, from any table, and click one of the sort buttons, and the sort would execute.
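    As far as I know there is no built-in "named sort" in Excel 2007, so the usual route is one small VBA macro per sort, each assigned to its own button. A rough sketch, assuming the selection has no header row and that the third selected column is the one to sort on - both assumptions, adjust to the real criteria:

        ' hypothetical macro; create one of these per named sort and bind each to a button
        Sub SortSelectionByThirdColumnDesc()
            With ActiveSheet.Sort
                .SortFields.Clear
                .SortFields.Add Key:=Selection.Columns(3), Order:=xlDescending
                .SetRange Selection
                .Header = xlNo
                .Apply
            End With
        End Sub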

    Read the article

  • Wget - if / else download condition?

    - by Kai
    I want wget to prefer a certain file type over another if the files have the same basename. For example: if foo.ogg is available, don't download foo.mp3. The way I use wget so far to crawl/automatically download (in case anyone is interested):
        wget -Dfoo.com -I /folder/ -r -l 1 -nc -A.ogg,.mp3 -i http://www.foo.com/folder/
    but this, of course, gets me .mp3 AND .ogg files. It often also gets me image files like .png which I didn't want in the first place, and discards them afterwards. Any ideas?
    (Syntax explanation:
    -D: download only from this domain
    -I: download only from this subfolder of the domain
    -r: recursive (follow links and directory structure)
    -l 1: follow only 1 link deep
    -nc: no clobber = download only if the file doesn't exist
    -A: accept/download only *.ogg and *.mp3 (the html files needed for crawling are discarded afterwards)
    -i: download URL / starting point)
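    As far as I know wget has no conditional accept rule, so one workaround (a rough, untested sketch reusing the command above) is to let the crawl fetch both formats and then delete each .mp3 that has an .ogg counterpart:

        wget -Dfoo.com -I /folder/ -r -l 1 -nc -A.ogg,.mp3 -i http://www.foo.com/folder/
        # afterwards, drop any .mp3 whose basename also exists as .ogg
        find . -name '*.ogg' -print0 | while IFS= read -r -d '' f; do
            rm -f -- "${f%.ogg}.mp3"
        done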

    Read the article
