Search Results

Search found 8429 results on 338 pages for 'batch processing'.

  • For single-producer, single-consumer, should I use a BlockingCollection or a ConcurrentQueue?

    - by Jonathan Allen
    For a single-producer, single-consumer scenario, should I use a BlockingCollection or a ConcurrentQueue? Concerns:

    * My goal is to pull up to 100 items at a time and send them as a batch to the next step.
    * If I use a ConcurrentQueue, I have to manually put the consumer to sleep when there is no work to be done; otherwise I waste CPU cycles on spinning.
    * If I use a BlockingCollection and I only have 99 work items, it could block indefinitely until the 100th item arrives.

    http://msdn.microsoft.com/en-us/library/system.collections.concurrent.aspx
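
    One way to reconcile those two concerns is to block only for the first item of a batch, then drain whatever else is immediately available, up to the limit. A minimal sketch, assuming an int work item and an illustrative ProcessBatch callback (neither is from the post):

        using System;
        using System.Collections.Generic;
        using System.Collections.Concurrent;

        static class BatchConsumer
        {
            const int BatchSize = 100; // batch limit mentioned in the question

            public static void ConsumeBatches(BlockingCollection<int> queue,
                                              Action<List<int>> processBatch)
            {
                while (!queue.IsCompleted)
                {
                    var batch = new List<int>(BatchSize);
                    int item;
                    try
                    {
                        item = queue.Take(); // sleeps instead of spinning
                    }
                    catch (InvalidOperationException)
                    {
                        break; // CompleteAdding() was called and the queue is drained
                    }
                    batch.Add(item);

                    // Grab up to 99 more without waiting, so a partial batch
                    // is never stuck waiting for the 100th item to arrive.
                    while (batch.Count < BatchSize && queue.TryTake(out item))
                        batch.Add(item);

                    processBatch(batch);
                }
            }
        }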

  • How can I create a shortcut to uninstall MyApplication using the Visual Studio Installer?

    - by Coder7862396
    I have created an installer for MyApplication using the Visual Studio Installer (VSI) Setup Project. I would like to create a shortcut in the user's Start Menu to uninstall MyApplication (instead of making them go through the Add/Remove Programs control panel). I cannot simply create a batch/script file that runs the command "msiexec /uninstall {MyGUID}" because, although it does succeed in running the uninstaller, when the uninstall completes the program folder (wherever the user chose to install the app) does not get deleted; it stays behind, empty. How can I create a shortcut that uninstalls the program but does not leave behind an empty program folder?

  • Database file is inexplicably locked during SQLite commit

    - by sweeney
    Hello, I'm performing a large number of INSERTs into a SQLite database, using just one thread. I batch the writes to improve performance and to have a bit of security in case of a crash: basically I cache up a bunch of data in memory and then, when I deem it appropriate, loop over all of that data and perform the INSERTs. The code for this is shown below:

        public void Commit()
        {
            using (SQLiteConnection conn = new SQLiteConnection(this.connString))
            {
                conn.Open();
                using (SQLiteTransaction trans = conn.BeginTransaction())
                {
                    using (SQLiteCommand command = conn.CreateCommand())
                    {
                        command.CommandText = "INSERT OR IGNORE INTO [MY_TABLE] (col1, col2) VALUES (?,?)";
                        command.Parameters.Add(this.col1Param);
                        command.Parameters.Add(this.col2Param);
                        foreach (Data o in this.dataTemp)
                        {
                            this.col1Param.Value = o.Col1Prop;
                            this.col2Param.Value = o.Col2Prop;
                            command.ExecuteNonQuery();
                        }
                    }
                    this.TryHandleCommit(trans);
                }
                conn.Close();
            }
        }

    I now employ the following gimmick to get the thing to eventually work:

        private void TryHandleCommit(SQLiteTransaction trans)
        {
            try
            {
                trans.Commit();
            }
            catch (Exception e)
            {
                Console.WriteLine("Trying again...");
                this.TryHandleCommit(trans);
            }
        }

    I create my DB like so:

        public DataBase(String path)
        {
            // build connection string
            SQLiteConnectionStringBuilder connString = new SQLiteConnectionStringBuilder();
            connString.DataSource = path;
            connString.Version = 3;
            connString.DefaultTimeout = 5;
            connString.JournalMode = SQLiteJournalModeEnum.Persist;
            connString.UseUTF16Encoding = true;

            using (connection = new SQLiteConnection(connString.ToString()))
            {
                // check for existence of db
                FileInfo f = new FileInfo(path);
                if (!f.Exists) // build new blank db
                {
                    SQLiteConnection.CreateFile(path);
                    connection.Open();
                    using (SQLiteTransaction trans = connection.BeginTransaction())
                    {
                        using (SQLiteCommand command = connection.CreateCommand())
                        {
                            command.CommandText = DataBase.CREATE_MATCHES;
                            command.ExecuteNonQuery();
                            command.CommandText = DataBase.CREATE_STRING_DATA;
                            command.ExecuteNonQuery();
                            // TODO add logging
                        }
                        trans.Commit();
                    }
                    connection.Close();
                }
            }
        }

    I then export the connection string and use it to obtain new connections in different parts of the program. At seemingly random intervals, though at far too great a rate to ignore or otherwise work around, I get an unhandled SQLiteException: Database file is locked. This occurs when I attempt to commit the transaction. No errors seem to occur prior to then, and it does not always happen; sometimes the whole thing runs without a hitch. No reads are being performed on these files before the commits finish.

    Some context: I have the very latest SQLite binary, I'm compiling for .NET 2.0 in VS 2008, the DB is a local file, and all of this activity is encapsulated within one thread/process. Virus protection is off (though I think that was only relevant if you were connecting over a network?). As per Scotsman's post I have implemented the following changes:

    * Journal mode set to Persist
    * DB files stored in C:\Docs + Settings\ApplicationData via the System.Windows.Forms.Application.AppData Windows call
    * No inner exception
    * Witnessed on two distinct machines (albeit with very similar hardware and software)
    * Have been running Process Monitor; no extraneous processes are attaching themselves to the DB files, so the problem is definitely in my code...

    Does anyone have any idea what's going on here? I know I just dropped a whole mess of code, but I've been trying to figure this out for way too long. My thanks to anyone who makes it to the end of this question!

    - brian

    UPDATES: Thanks for the suggestions so far! I've implemented many of the suggested changes. I feel that we are getting closer to the answer... however... The code above technically works, but it is non-deterministic! It is not guaranteed to do anything aside from spin in neutral forever. In practice it seems to work somewhere between the 1st and 10th iteration. If I batch my commits at a reasonable interval, damage will be mitigated, but I really do not want to leave things in this state... More suggestions welcome!
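
    A side note on the retry gimmick above: it recurses without bound, so a lock that never clears turns into an infinite spin (or eventually a stack overflow). A bounded variant of the same idea, purely as a sketch (the retry count and delay are assumptions, not from the post):

        private void TryHandleCommit(SQLiteTransaction trans)
        {
            const int maxRetries = 5; // assumed limit, not from the original post
            for (int attempt = 1; ; attempt++)
            {
                try
                {
                    trans.Commit();
                    return;
                }
                catch (SQLiteException)
                {
                    if (attempt >= maxRetries)
                        throw; // give up and let the caller see the real error
                    Console.WriteLine("Trying again...");
                    System.Threading.Thread.Sleep(100 * attempt); // brief back-off
                }
            }
        }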

  • What good software or scripts are available for managing users and subscriptions on our website?

    - by undefined
    Hi all. OK, so it's not exactly a programming question, but does anyone know of, or have experience with, a system for managing users on a website we are building? What is the shortlist of good, feature-rich, secure solutions? We need PHP and MySQL integration and payment support for the main credit cards. We will also want to be able to track users, generate reports about usage, subscriptions, etc., and create and send batch emails. It would also be great to be able to integrate customer support with this, so we can view support tickets raised by users. Cheers. We are running PHP and MySQL on an IIS server.

  • PHP exec() and system() functions always return false in IIS

    - by Loftx
    Hi there, I'm trying to use PHP's exec() or system() function (or any other similar function) to run a batch file, but I can't seem to get these to return anything. The simplest example I've seen is this, which outputs nothing:

        <?php echo system('dir'); ?>

    The script is running on a Windows XP machine on IIS with PHP installed, and I've also tried it on my shared hosting account running Windows 2003 Server/IIS. Can anyone suggest what I need to do to get this working, or provide any commands I can use for troubleshooting? Cheers, Tom

  • How can I pass a Visual Studio project's assembly version to another project for use in a post-build event?

    - by Coder7862396
    I have a solution with 2 projects:

    * My Application 1.2.54 (C# WinForms)
    * My Application Setup 1.0.0.0 (WiX Setup)

    I would like to add a post-build event to the WiX Setup project to run a batch file and pass it a command-line parameter containing My Application's assembly version number. The code may look something like this:

        CALL MyBatchFile.bat "$(fileVersion.ProductVersion($(var.My Application.TargetPath)))"

    But this results in the following error:

        Unhandled Exception: The expression """.My Application" cannot be evaluated. Method 'System.String.My Application' not found. C:\My Application\My Application Setup\My Application Setup.wixproj
        Error: The expression """.My Application" cannot be evaluated. Method 'System.String.My Application' not found. C:\My Application\My Application Setup\My Application Setup.wixproj

    I would like to be able to pass "1.2.54" to MyBatchFile.bat somehow.

  • How to pass binaries built upstream to a remote downstream build slave

    - by sbi
    We're using Hudson on Windows to build a .NET solution and run the unit tests (NUnit); Hudson is used to start batch files that do the actual work. I am now trying to set up a new test that will run on a build slave and will take a very long time. The test should use the binaries produced by the upstream build. I have searched the Hudson documentation, but I cannot find how to pass upstream build artifacts to downstream slaves. How do I do this?

  • Determine process using a port, without sudo

    - by pat
    I'd like to find out which process (in particular, the process id) is using a given port. The one catch is, I don't want to use sudo, nor am I logged in as root. The processes I want this to work for are run by the same user I'd be querying as, so I would have thought this was simple. Neither lsof nor netstat will tell me the process id unless I run them using sudo; they will tell me that the port is being used, though. As some extra context: I have various apps all connecting via SSH to a server I manage and creating reverse port forwards. Once those are set up, my server does some processing using the forwarded port, and then the connection can be killed. If I can map specific ports (each app has its own) to processes, this is a simple script. Any suggestions? This is on an Ubuntu box, by the way, but I'm guessing any solution will be standard across most Linux distros.

  • Improving heavy work in a loop with multithreading

    - by xjaphx
    I have a little problem with my data processing.

        public void ParseDetails()
        {
            for (int i = 0; i < mListAppInfo.Count; ++i)
            {
                ParseOneDetail(i);
            }
        }

    For 300 records, it usually takes around 13-15 minutes. I've tried to improve this by using Parallel.For(), but it always stops at some point.

        public void ParseDetails()
        {
            Parallel.For(0, mListAppInfo.Count, i => ParseOneDetail(i));
        }

    In the method ParseOneDetail(int index), I write an output log line to track the record id that is being processed. It always hangs at some point, and I don't know why:

        ParseOneDetail(): 89 ...
        ParseOneDetail(): 90 ...
        ParseOneDetail(): 243 ...
        ParseOneDetail(): 92 ...
        ParseOneDetail(): 244 ...
        ParseOneDetail(): 93 ...
        ParseOneDetail(): 245 ...
        ParseOneDetail(): 247 ...
        ParseOneDetail(): 94 ...
        ParseOneDetail(): 248 ...
        ParseOneDetail(): 95 ...
        ParseOneDetail(): 99 ...
        ParseOneDetail(): 249 ...
        ParseOneDetail(): 100 ...
        _ <hangs at this point>

    Appreciate your help and suggestions to improve this. Thank you!

    Edit 1: update for the method:

        private void ParseOneDetail(int index)
        {
            Console.WriteLine("ParseOneDetail(): " + index + " ... ");
            ApplicationInfo appInfo = mListAppInfo[index];
            var htmlWeb = new HtmlWeb();
            var document = htmlWeb.Load(appInfo.AppAnnieURL);
            // get first one only
            HtmlNode nodeStoreURL = document.DocumentNode.SelectSingleNode(Constants.XPATH_FIRST);
            appInfo.StoreURL = nodeStoreURL.Attributes[Constants.HREF].Value;
        }

    Edit 2: This is the error output after running for a while, as Enigmativity suggested:

        ParseOneDetail(): 234 ...
        ParseOneDetail(): 87 ...
        ParseOneDetail(): 235 ...
        ParseOneDetail(): 236 ...
        ParseOneDetail(): 88 ...
        ParseOneDetail(): 238 ...
        ParseOneDetail(): 89 ...
        ParseOneDetail(): 90 ...
        ParseOneDetail(): 239 ...
        ParseOneDetail(): 92 ...

        Unhandled Exception: System.AggregateException: One or more errors occurred.
        ---> System.Net.WebException: The operation has timed out
           at System.Net.HttpWebRequest.GetResponse()
           at HtmlAgilityPack.HtmlWeb.Get(Uri uri, String method, String path, HtmlDocument doc, IWebProxy proxy, ICredentials creds) in D:\Source\htmlagilitypack.new\Trunk\HtmlAgilityPack\HtmlWeb.cs:line 1355
           at HtmlAgilityPack.HtmlWeb.LoadUrl(Uri uri, String method, WebProxy proxy, NetworkCredential creds) in D:\Source\htmlagilitypack.new\Trunk\HtmlAgilityPack\HtmlWeb.cs:line 1479
           at HtmlAgilityPack.HtmlWeb.Load(String url, String method) in D:\Source\htmlagilitypack.new\Trunk\HtmlAgilityPack\HtmlWeb.cs:line 1103
           at HtmlAgilityPack.HtmlWeb.Load(String url) in D:\Source\htmlagilitypack.new\Trunk\HtmlAgilityPack\HtmlWeb.cs:line 1061
           at SimpleChartParser.AppAnnieParser.ParseOneDetail(ApplicationInfo appInfo) in c:\users\nhn60\documents\visual studio 2010\Projects\FunToolPack\SimpleChartParser\AppAnnieParser.cs:line 90
           at SimpleChartParser.AppAnnieParser.<ParseDetails>b__0(ApplicationInfo ai) in c:\users\nhn60\documents\visual studio 2010\Projects\FunToolPack\SimpleChartParser\AppAnnieParser.cs:line 80
           at System.Threading.Tasks.Parallel.<>c__DisplayClass21`2.<ForEachWorker>b__17(Int32 i)
           at System.Threading.Tasks.Parallel.<>c__DisplayClassf`1.<ForWorker>b__c()
           at System.Threading.Tasks.Task.InnerInvoke()
           at System.Threading.Tasks.Task.InnerInvokeWithArg(Task childTask)
           at System.Threading.Tasks.Task.<>c__DisplayClass7.<ExecuteSelfReplicating>b__6(Object )
           --- End of inner exception stack trace ---
           at System.Threading.Tasks.Task.ThrowIfExceptional(Boolean includeTaskCanceledExceptions)
           at System.Threading.Tasks.Task.Wait(Int32 millisecondsTimeout, CancellationToken cancellationToken)
           at System.Threading.Tasks.Parallel.ForWorker[TLocal](Int32 fromInclusive, Int32 toExclusive, ParallelOptions parallelOptions, Action`1 body, Action`2 bodyWithState, Func`4 bodyWithLocal, Func`1 localInit, Action`1 localFinally)
           at System.Threading.Tasks.Parallel.ForEachWorker[TSource,TLocal](TSource[] array, ParallelOptions parallelOptions, Action`1 body, Action`2 bodyWithState, Action`3 bodyWithStateAndIndex, Func`4 bodyWithStateAndLocal, Func`5 bodyWithEverything, Func`1 localInit, Action`1 localFinally)
           at System.Threading.Tasks.Parallel.ForEachWorker[TSource,TLocal](IEnumerable`1 source, ParallelOptions parallelOptions, Action`1 body, Action`2 bodyWithState, Action`3 bodyWithStateAndIndex, Func`4 bodyWithStateAndLocal, Func`5 bodyWithEverything, Func`1 localInit, Action`1 localFinally)
           at System.Threading.Tasks.Parallel.ForEach[TSource](IEnumerable`1 source, Action`1 body)
           at SimpleChartParser.AppAnnieParser.ParseDetails() in c:\users\nhn60\documents\visual studio 2010\Projects\FunToolPack\SimpleChartParser\AppAnnieParser.cs:line 80
           at SimpleChartParser.Program.Main(String[] args) in c:\users\nhn60\documents\visual studio 2010\Projects\FunToolPack\SimpleChartParser\Program.cs:line 15
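
    The inner exception shows the HtmlWeb.Load() call timing out once many requests are in flight at once. A hedged sketch of one way to keep the same structure while capping concurrency and surviving individual timeouts; the MaxDegreeOfParallelism value of 4 is an assumption for illustration, not something from the post:

        private void ParseDetails()
        {
            // Cap the number of simultaneous downloads (value is an assumption).
            var options = new ParallelOptions { MaxDegreeOfParallelism = 4 };
            Parallel.For(0, mListAppInfo.Count, options, i =>
            {
                try
                {
                    ParseOneDetail(i);
                }
                catch (System.Net.WebException ex)
                {
                    // One timed-out request no longer aborts the whole loop.
                    Console.WriteLine("ParseOneDetail(): " + i + " failed: " + ex.Message);
                }
            });
        }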

  • SQL Server 2008, not enough disk space

    - by snorlaks
    Hello, I'm executing a SQL query against my database. SQL Server 2008 is installed on my D drive, which has 55 GB of free space; my C drive has something like 150 MB free (right now). While executing the query on quite a big table (16 GB) I get an error: "An error occurred while executing batch. Error message is: Not enough disk space." I would like to know if there is any way I can make SQL Server use the D drive instead of C, or whether there is some other problem with what I'm doing. Thanks for the help.

  • Repeatedly querying xml using python

    - by Jack
    I have some XML documents I need to run queries on. I've created some Python scripts (using ElementTree) to do this, since I'm vaguely familiar with it. The way it works is that I run the scripts several times with different arguments, depending on what I want to find out. These files can be relatively large (10 MB+), and so parsing them takes rather a long time. On my system, just running:

        tree = ElementTree.parse(document)

    takes around 30 seconds, with a subsequent findall query only adding around a second to that. Seeing as the way I'm doing this requires me to repeatedly parse the file, I was wondering if there is some sort of caching mechanism I can use so that the ElementTree.parse computation can be reduced on subsequent queries. I realise the smart thing to do here may be to try to batch as many queries as possible together in the Python script, but I was hoping there might be another way. Thanks.

  • App Engine (Python) returns empty for valid queries

    - by Grant
    I've got an app with around half a million 'records', each of which stores only three fields. I'd like to look up records by a string field with a query, but I'm running into problems. If I visit the console page, manually view a record, and save it (without making changes), it shows up in a query like:

        SELECT * FROM wordEntry WHERE wordStr = 'SomeString'

    If I don't do this, I get 'no results'. Does App Engine need time to update? If so, how much? (I was also having trouble batch deleting and modifying data, but I was able to break that problem up into smaller chunks.)

  • Simple Ubuntu Server - Expanding disk space - adding a new drive (LVM or RAID0) to an existing setup - how?

    - by NightWolf
    I have a 1 TB ext4 partition mounted at / with all my data and Ubuntu 11.04 (Natty) installed. Now this drive is almost full (I used it as a database server for some processing). RAID0 is OK; I can take a failure (touch wood). But I need a way to grow this partition. I have a new 1 TB drive I want to add; however, as my Ubuntu boot and all my data are on the one partition, I'm not sure how I can go about setting up a RAID0 or LVM array without losing all my data. So the question is: how can I extend my existing ext4 partition over two physical drives without losing data? Thanks!

  • How do I analyze an Apache Bench result?

    - by Alan Hoffmeister
    I need some help with analyzing a log from Apache Bench:

        Benchmarking texteli.com (be patient)
        Completed 100 requests
        Completed 200 requests
        Completed 300 requests
        Completed 400 requests
        Completed 500 requests
        Completed 600 requests
        Completed 700 requests
        Completed 800 requests
        Completed 900 requests
        Completed 1000 requests
        Finished 1000 requests

        Server Software:
        Server Hostname:        texteli.com
        Server Port:            80

        Document Path:          /4f84b59c557eb79321000dfa
        Document Length:        13400 bytes

        Concurrency Level:      200
        Time taken for tests:   37.030 seconds
        Complete requests:      1000
        Failed requests:        0
        Write errors:           0
        Total transferred:      13524000 bytes
        HTML transferred:       13400000 bytes
        Requests per second:    27.01 [#/sec] (mean)
        Time per request:       7406.024 [ms] (mean)
        Time per request:       37.030 [ms] (mean, across all concurrent requests)
        Transfer rate:          356.66 [Kbytes/sec] received

        Connection Times (ms)
                      min  mean[+/-sd] median   max
        Connect:       27   37   19.5     34     319
        Processing:    80 6273 1673.7   6907    8987
        Waiting:       47 3436 2085.2   3345    8856
        Total:        115 6310 1675.8   6940    9022

        Percentage of the requests served within a certain time (ms)
          50%   6940
          66%   6968
          75%   6988
          80%   7007
          90%   7025
          95%   7078
          98%   8410
          99%   8876
         100%   9022 (longest request)

    What can these results tell me? Isn't 27 rps too slow?
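
    For reading such reports, note how the headline numbers relate to each other: Requests per second = Complete requests / Time taken = 1000 / 37.030 ≈ 27.01, and the first "Time per request" line is Concurrency Level × Time taken / Complete requests = 200 × 37.030 / 1000 ≈ 7406 ms (the mean latency each client saw), while the second is simply Time taken / Complete requests = 37.030 ms, i.e. server throughput per request across all concurrent clients.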

  • How to minimize services in Ubuntu

    - by codeomnitrix
    Hello everyone. I have been using Ubuntu for the last 6-7 months. I have 512 MB of RAM and a P4 processor with a 1.73 GHz clock speed. Being a programmer, I have to work with IDEs like Eclipse and NetBeans, and they sometimes hang. So, is there any option in Ubuntu to stop running services, just like I have in Windows with "msconfig" or My Computer > Manage? And where could I find details about the services, so that I know what the effect will be if I stop one? I am using Ubuntu 10.10. Thanks

  • Is a file good for interprocess communication?

    - by Karthik
    Hi, I have an EXE and a DLL running in different processes. From the DLL I have to send a large amount of data to the EXE, which would vary from 50 characters to 2000 characters and more (the data is the record ids of records saved in a DB). I thought about two options to do that:

    1. Use SendMessage, in which the data will be sent in batches.
    2. Use an intermediate file to transfer the data.

    Can anyone list the pros and cons of these methods? I have developed my components using C#/.NET. Thank you, folks.
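
    For reference, in Win32 the SendMessage option for passing a string between processes usually means the WM_COPYDATA message, since ordinary message parameters cannot carry a buffer across process boundaries. A hedged sketch of the sending side in C# follows; the target window handle, sender handle, and payload are placeholders, not details from the question:

        using System;
        using System.Runtime.InteropServices;
        using System.Text;

        static class CopyDataSender
        {
            const int WM_COPYDATA = 0x004A;

            [StructLayout(LayoutKind.Sequential)]
            struct COPYDATASTRUCT
            {
                public IntPtr dwData; // application-defined tag
                public int cbData;    // size of the payload in bytes
                public IntPtr lpData; // pointer to the payload
            }

            [DllImport("user32.dll")]
            static extern IntPtr SendMessage(IntPtr hWnd, int msg, IntPtr wParam,
                                             ref COPYDATASTRUCT lParam);

            // Sends a string (e.g. a batch of record ids) to another process's window.
            // By convention, wParam carries the sender's window handle, if it has one.
            public static void Send(IntPtr targetWindow, IntPtr senderWindow, string payload)
            {
                byte[] bytes = Encoding.Unicode.GetBytes(payload);
                IntPtr buffer = Marshal.AllocHGlobal(bytes.Length);
                try
                {
                    Marshal.Copy(bytes, 0, buffer, bytes.Length);
                    COPYDATASTRUCT cds = new COPYDATASTRUCT();
                    cds.dwData = IntPtr.Zero;
                    cds.cbData = bytes.Length;
                    cds.lpData = buffer;
                    SendMessage(targetWindow, WM_COPYDATA, senderWindow, ref cds);
                }
                finally
                {
                    // Safe to free here: SendMessage blocks until the receiver is done.
                    Marshal.FreeHGlobal(buffer);
                }
            }
        }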

  • Techniques to Monitor cron tasks?

    - by Tristan Juricek
    Are there good techniques for monitoring cron tasks over a cluster? We're starting to use cron to launch tasks at daily intervals. A few ideas for collecting the information:

    1. Add special application handling that logs information into some "network-aware" place, like a DB.
    2. Build up a logfile system that transfers the cron log periodically to a central point for processing/querying (along with other possible log files).

    I'm wondering if people have had success with doing things separately for cron versus other things, or if the tasks were integrated into a different approach completely. I'm leaning towards #2, but I'd like to know what more experienced folk might try out.

  • Unable to set NTFS permissions for ApplicationPoolIdentity on Windows 2008 SP2

    - by Kev
    On Windows 2008 R2, I am able to set NTFS permissions for an application pool's synthesised ApplicationPoolIdentity account thus:

        ICACLS d:\websites\site1\www /grant "IIS AppPool\site1":(CI)(OI)(M)

    The website's application pool is named site1 and is configured to run as ApplicationPoolIdentity. The site's authentication is also configured to authenticate as ApplicationPoolIdentity. I've done this a thousand times on Windows 2008 R2 Standard Edition with never a hitch. However, if I try to do the same on Windows 2008 SP2 Standard Edition, I get the error:

        IIS AppPool\site1: No mapping between account names and security IDs was done.
        Successfully processed 0 files; Failed processing 1 files

    I also notice that this fails if I try to set permissions for the application pool identity via the security GUI as well. I've seen this before, and a reboot has cleared the issue, but I'd like to know why it happens periodically. Googling around suggests other folks have hit this problem, but there's never a satisfactory explanation. Why would this be?

  • Higher-speed options for executing a very large (20 GB) .sql file in MySQL

    - by Jonogan
    My firm was delivered a 20+ GB .sql file in response to a request for data from the gov't. I don't have many options for getting the data in a different format, so I need options for how to import it in a reasonable amount of time. I'm running it on a high-end server (Win 2008 64-bit, MySQL 5.1) using Navicat's batch execution tool. It's been running for 14 hours and shows no signs of being near completion. Does anyone know of any higher-speed options for such a transaction, or is this what I should expect, given the large file size? Thanks

  • S#arp Architecture mapping many-to-many and ADO.NET Data Services: A single resource was expected for the result

    - by Leg10n
    Hi, I'm developing an application that reads data from a SQL Server database (migrated from a legacy DB) with NHibernate and S#arp Architecture, through ADO.NET Data Services. I'm trying to map a many-to-many relationship. I have an Error class:

        public class Error
        {
            public virtual int ERROR_ID { get; set; }
            public virtual string ERROR_CODE { get; set; }
            public virtual string DESCRIPTION { get; set; }
            public virtual IList<ErrorGroup> GROUPS { get; protected set; }
        }

    And then I have the error group class:

        public class ErrorGroup
        {
            public virtual int ERROR_GROUP_ID { get; set; }
            public virtual string ERROR_GROUP_NAME { get; set; }
            public virtual string DESCRIPTION { get; set; }
            public virtual IList<Error> ERRORS { get; protected set; }
        }

    And the overrides (note the collection properties referenced are the ERRORS and GROUPS lists defined above):

        public class ErrorGroupOverride : IAutoMappingOverride<ErrorGroup>
        {
            public void Override(AutoMapping<ErrorGroup> mapping)
            {
                mapping.Table("ERROR_GROUP");
                mapping.Id(x => x.ERROR_GROUP_ID, "ERROR_GROUP_ID");
                mapping.IgnoreProperty(x => x.Id);
                mapping.HasManyToMany<Error>(x => x.ERRORS)
                    .Table("ERROR_GROUP_LINK")
                    .ParentKeyColumn("ERROR_GROUP_ID")
                    .ChildKeyColumn("ERROR_ID").Inverse().AsBag();
            }
        }

        public class ErrorOverride : IAutoMappingOverride<Error>
        {
            public void Override(AutoMapping<Error> mapping)
            {
                mapping.Table("ERROR");
                mapping.Id(x => x.ERROR_ID, "ERROR_ID");
                mapping.IgnoreProperty(x => x.Id);
                mapping.HasManyToMany<ErrorGroup>(x => x.GROUPS)
                    .Table("ERROR_GROUP_LINK")
                    .ParentKeyColumn("ERROR_ID")
                    .ChildKeyColumn("ERROR_GROUP_ID").AsBag();
            }
        }

    When I view the data service in the browser, like http://localhost:1905/DataService.svc/Errors, it shows the list of errors with no problems, and using it like http://localhost:1905/DataService.svc/Errors(123) works too.

    The problem: when I want to see the errors in a group, or the groups for an error, like "http://localhost:1905/DataService.svc/Errors(123)?$expand=GROUPS", I get the XML document, but the browser says:

        The XML page cannot be displayed
        Cannot view XML input using XSL style sheet. Please correct the error and then click the Refresh button, or try again later.
        Only one top level element is allowed in an XML document. Error processing resource 'http://localhost:1905/DataServic...
        <error xmlns="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata">
        -^

    I view the source code, and I get the data; however, it comes with an exception:

        <error xmlns="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata">
          <code></code>
          <message xml:lang="en-US">An error occurred while processing this request.</message>
          <innererror xmlns="xmlns">
            <message>A single resource was expected for the result, but multiple resources were found.</message>
            <type>System.InvalidOperationException</type>
            <stacktrace>
              at System.Data.Services.Serializers.Serializer.WriteRequest(IEnumerator queryResults, Boolean hasMoved)
              at System.Data.Services.ResponseBodyWriter.Write(Stream stream)
            </stacktrace>
          </innererror>
        </error>

    Am I missing something? Where does this error come from?

  • MySQL dump, output each table row on a new line whilst using --extended-insert

    - by soopadoubled
    I'm having an issue where, for ease of use, I'd like to be able to format a command-line MySQL dump so that each row of a given table is on a new line when using the --extended-insert option. Usually when using --extended-insert, every row of a given table is output on one line, and as far as I am aware there's no way to change this, other than post-processing the dump with Perl or suchlike. The format I'm looking for is:

        --
        -- Dumping data for table `ww_tbCountry`
        --

        INSERT INTO `ww_tbCountry` (`iCountryId_PK`, `vCountryName`, `vShortName`, `iSortFlag`, `fTax`, `vCountryCode`, `vSageTaxCode`) VALUES
        (22, 'Albania', 'AL', 1, 0.00, '8', 'T9'),
        (33, 'Austria', 'AT', 1, 15.00, '40', 'T9'),
        (40, 'Belarus', 'BY', 1, 0.00, '112', 'T9'),
        (41, 'Belgium', 'BE', 1, 15.00, '56', 'T9'),
        (51, 'Bulgaria', 'BG', 1, 15.00, '100', 'T9'

    However, when I dump a database using phpMyAdmin with --extended-insert, each row is dumped on a new line (as shown by the example above). I've gone through phpMyAdmin and can't find any documentation that would explain this. Is anyone able to shed any light on this? Thanks in advance, Ian

  • Programmatically Download Image to Desktop from Remote App with Ruby?

    - by viatropos
    I was thinking about making a little online crop/resize batch processor, and wanted to know if there is a way for me to do the following:

    1. Upload an image and specify dimensions.
    2. Click "process" and the app resizes the image.
    3. The image downloads automatically to wherever it was uploaded from (say, my desktop), but with a new name (based on the time, for example).

    This would make it so I could host a free image processor that never stores any data other than tempfiles. Is that possible? Something like Rails' send_file method, but I'm using Sinatra and am looking for something in pure Ruby.

  • Bash Nested Loops, mixture of dates and numbers

    - by Saleh
    Hi, I am trying to output a range of commands with different dates and numbers associated, one for each hour. The output I'm trying to produce in a loop is:

        shell.sh filename<number e.g. between 1-24> <date e.g. 20100928> <number e.g. between 1-24> <id>

    So basically the above will generate the output 24 times for each particular day, with a unique 4-digit id. I was thinking of having a nested loop, as the batch number needs to be unique. Can anyone help?

  • How do you keep track of what you have released in production?

    - by systempuntoout
    Typically a production deploy does not involve just a mere source code update (build) but requires a lot of other important tasks, for example:

    * DB scripts (tables, queries, ...)
    * Configuration files (different between test and production)
    * Batches to schedule
    * Executables to move to the correct path
    * Etc., etc.

    In our company we just send an email to a "release email address" describing the tasks in order: which changesets need to be published (TFS), which SPs need to be updated, DB scripts, and so on. I believe there's no magic tool that does these tasks automagically, in order, rollback included; but there's probably something better than email that helps to keep track of releases in production. Do you have any tools to suggest or practices to share?

  • Filtering Client IP from Access Log for Urchin

    - by Ram Prasad
    I have some Apache logs to process, and since the webserver is behind two levels of reverse proxies, I am getting two IPs in the X-Forwarded-For header. For example:

        208.34.234.55, 127.0.0.1 - - [29/Oct/2009:21:38:13 -0500] "GET /monkey.html HTTP/1.0" 200 20845 0 0 "http://www.monkey.com/" "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.0.15) Gecko/2009101601 Firefox/3.0.15 (.NET CLR 3.5.30729)"

    Now, how do I filter this in Urchin (or remove it in the Apache logging) so that 127.0.0.1 is removed from processing? Currently Urchin is not able to recognize the multiple IP addresses, so it does not log the remote IP.
