Search Results

Search found 23341 results on 934 pages for 'command history'.

Page 254 of 934

  • Google Chrome Extensions: Launch Event (part 2)

    Video footage from the Google Chrome Extensions launch event on 12/09/09. Aaron Boodman and Erik Kay, technical leads for the Google Chrome extensions team, present a quick history of the extensions system in Google Chrome and discuss its design principles, focusing on why extensions are webby. From: GoogleDevelopers | Views: 3036 | 12 ratings | Time: 05:25 | More in Science & Technology

    Read the article

  • bash script to check running process

    - by elasticsecurity
    I wrote a bash script to check if a process is running. It doesn't work since the ps command always returns exit code 1. When I run the ps command from the command line, $? is correctly set, but within the script it is always 1. Any idea?

        #!/bin/bash
        SERVICE=$1
        ps -a | grep -v grep | grep $1 > /dev/null
        result=$?
        echo "exit code: ${result}"
        if [ "${result}" -eq "0" ] ; then
            echo "`date`: $SERVICE service running, everything is fine"
        else
            echo "`date`: $SERVICE is not running"
        fi

    Bash version: GNU bash, version 3.2.25(1)-release (x86_64-redhat-linux-gnu)
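
    A common fix (a sketch, not from the original thread): pgrep checks the process table directly and sets $? itself, and it sidesteps the classic problem of grep matching its own (or the calling script's own) command line.

        #!/bin/bash
        # Minimal sketch, assuming the service shows up under its own process name
        SERVICE=$1
        if pgrep -x "$SERVICE" > /dev/null; then
            echo "$(date): $SERVICE service running, everything is fine"
        else
            echo "$(date): $SERVICE is not running"
        fi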

    Read the article

  • How do I automatically rebuild the Sphinx index under django-sphinx?

    - by Apreche
    I just set up django-sphinx, and it is working beautifully. I am now able to search my model and get amazing results. The one problem is that I have to build the index by hand using the indexer command. That means every time I add new content, I have to manually hit the command line to rebuild the search index. That is just not acceptable. I could make a cron job that automatically runs the indexer command every so often, but that's far from optimal: new data won't be indexed until the cron job runs again, and the indexer will run unnecessarily most of the time since my site doesn't have data added very often. How do I set it up so that the Sphinx index automatically rebuilds itself whenever data is added to or modified in a searchable Django model?
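
    One approach (a sketch under assumptions, not a django-sphinx feature): hook Django's post_save signal for each searchable model and shell out to the Sphinx indexer whenever a record changes. The model and index names below are hypothetical.

        # sketch: re-run the Sphinx indexer when a searchable model is saved
        import subprocess
        from django.db.models.signals import post_save

        from myapp.models import Article  # hypothetical searchable model

        def rebuild_sphinx_index(sender, instance, **kwargs):
            # --rotate rebuilds the index and swaps it in without stopping searchd
            subprocess.Popen(["indexer", "--rotate", "article_index"])

        post_save.connect(rebuild_sphinx_index, sender=Article)

    For a site with infrequent writes this keeps the index current without a cron job, at the cost of running the indexer once per save.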

    Read the article

  • Pass elements of a list as arguments to a function in python

    - by Wilduck
    I'm building a simple interpreter in Python and I'm having trouble handling differing numbers of arguments to my functions. My current method is to get a list of the commands/arguments as follows:

        args = str(raw_input('>> ')).split()
        com = args.pop(0)

    Then to execute com, I check whether it is in my dictionary of command-to-code mappings, and if it is I call the function I have stored there. For a command with no arguments, this would look like:

        commands[com]()

    However, if a command had multiple arguments, I would want this:

        commands[com](args[0], args[1])

    Is there some trick where I could pass some (or all) of the elements of my arg list to the function that I'm trying to call? Or is there a better way of implementing this without having to use Python's cmd class?
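
    The usual answer (a short sketch, assuming the same commands dict and args list as above) is argument unpacking with *:

        def greet(name, greeting):
            print("%s, %s!" % (greeting, name))

        commands = {"greet": greet}

        args = raw_input('>> ').split()   # e.g. "greet Alice hello"
        com = args.pop(0)
        commands[com](*args)              # * expands the list into positional arguments

    The same form works for zero arguments (an empty list unpacks to no arguments), so one call site handles every arity.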

    Read the article

  • Video documentary on the open source culture?

    - by explorest
    Hello, I'm looking for some videos on these subjects:

    1. A movie/documentary detailing the origin, history, and current state of open source culture.
    2. A movie/documentary on how open source software actually gets developed: what the technical workflows are, and how people create projects, recruit contributors, build a community, assign roles, track issues, assimilate newcomers, etc.

    Could someone suggest a title?

    Read the article

  • USTR's Bully Report Unfairly Blames Canada Again

    Michael Geist: "The U.S. government has released its annual Special 301 report, in which it purports to identify those countries with inadequate intellectual property laws. Given the recent history and the way in which the list is developed, it will come as no surprise that the U.S. is again implausibly claiming that Canada is among the worst of the worst."

    Read the article

  • Java OutOfMemoryError

    - by dqm
    I am running the following command on a Unix box:

        java -Xms3800m -Xmx3800m org.apache.xalan.xslt.Process -out Cust.txt -in test13l.xml -xsl CustDetails.xsl

    It is a Java command that calls the Xalan processor to parse the XML file (test13l.xml) using the XSL stylesheet (CustDetails.xsl) and returns Cust.txt. The command works fine and the output is generated. It takes 12 minutes to process an XML file of 1.1 GB, and 22 minutes to process a file of 1.44 GB. However, when I try to process a file of 1.66 GB, it errors out with the following message:

        (Location of error unknown)XSLT Error (java.lang.OutOfMemoryError): null

    I have already increased the Java heap size to 3800 MB and am not sure what more I can do. Many thanks for your help.

    Read the article

  • DOS Batch script loop

    - by Tom J Nowell
    I need to execute a command 100-200 times. So far my research indicates that I would either have to copy and paste 100 copies of this command, OR use a FOR loop, but the FOR loop expects a list of items; hence I would need 200 files to operate on, or a list of 200 items, defeating the point. I would really rather not have to write a C program and go through the effort of documenting why I had to write another program to execute my program for test purposes. Modifying my program itself is also not an option. So, given a command, how would I execute it a given number of times via a DOS batch script?
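
    One common answer (a sketch; the command name is a placeholder) is FOR /L, the counted form of FOR, which loops over a numeric range instead of a file list:

        @echo off
        rem Run the command 200 times; (start,step,end) = (1,1,200)
        for /L %%i in (1,1,200) do (
            mycommand.exe
        )

    Inside a batch file the loop variable is written %%i; typed directly at the prompt it would be %i.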

    Read the article

  • Installing Sass on Gentoo

    - by iyrag
    I've been trying to install Sass on Gentoo, but it hasn't been going too well. Unfortunately, the latest version of Sass in Portage is 3.1.21. What I want to use Sass for requires at least Sass 3.2, which is available through RubyGems. What I've tried:

        emerge dev-ruby/sass    (installs an old version)
        gem install sass

    The second command appears to install the Sass gem. However, I do not use Rails or Ruby in any other aspect apart from Sass, so the gem appears useless to me. In addition, I do not know where gems are installed to or how to use them (I'm a Ruby noob). All I want to do is call sass from the command line. Are there any ways to obtain an up-to-date version of Sass which I can just use from the command line? Cheers.
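
    A hedged sketch (exact paths vary with the Ruby setup Portage installed): the gem already ships a standalone sass executable; it just has to be on PATH. RubyGems reports where it puts executables, so something along these lines should work; the export path below is a placeholder, not the real Gentoo location.

        gem install sass
        gem environment            # note the "EXECUTABLE DIRECTORY" entry
        export PATH="/usr/local/lib/ruby/gems/bin:$PATH"   # placeholder: use the directory reported above
        sass --version             # should now report the 3.2.x gem

    No Rails is involved; the gem is just the packaging for the command-line tool.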

    Read the article

  • Serial port determinism

    - by Matt Green
    This seems like a simple question, but it is difficult to search for. I need to interface with a device over the serial port. In the event my program (or another) does not finish writing a command to the device, how do I ensure the next run of the program can successfully send a command? Example:

    1. The foo program runs and begins writing "A_VERY_LONG_COMMAND".
    2. The user terminates the program, but the program has only written "A_VERY".
    3. The user runs the program again, and the command is resent. Except the device sees "A_VERYA_VERY_LONG_COMMAND", which isn't what we want.

    Is there any way to make this more deterministic? Serial port programming feels very out of control due to issues like this.
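
    There is no general fix at the port level; the usual answer is framing in the protocol itself. A minimal pyserial sketch (assuming the device accepts a newline as a command terminator, which may not be true for this hardware):

        # sketch: flush any partial output, then send one fully framed command
        import serial

        with serial.Serial("/dev/ttyUSB0", 9600, timeout=1) as port:
            port.reset_output_buffer()               # drop any half-written command
            port.write(b"A_VERY_LONG_COMMAND\n")     # terminator marks a complete command
            port.flush()                             # block until the data is handed to the driver

    If the device's protocol has no terminator or checksum, the only recovery is device-specific (e.g. a reset or abort sequence) sent before each new command.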

    Read the article

  • How to create a bootable USB on a Mac to install Ubuntu onto a formatted PC?

    - by kutayk
    Hi, I have a working MacBook Pro and a crashed Windows 7 PC. I was running Ubuntu on my PC, but after a recurrent issue I now can't boot either OS. I would like to install only Ubuntu on my PC and make Windows 7 history for good. But I am a bit puzzled by the bootable-USB options on the website. If I create a bootable USB on my Mac following the instructions for Mac OS, would it boot in a PC?
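
    Yes: the stick boots from the Ubuntu image's own boot loader, not anything Mac-specific. One common way to write it from OS X (a sketch; the disk identifier is an assumption, so confirm it with diskutil list first, because dd will overwrite whatever disk it is pointed at):

        hdiutil convert ubuntu.iso -format UDRW -o ubuntu.img    # produces ubuntu.img.dmg
        diskutil list                                            # identify the USB stick, e.g. /dev/disk2
        diskutil unmountDisk /dev/disk2
        sudo dd if=ubuntu.img.dmg of=/dev/rdisk2 bs=1m
        diskutil eject /dev/disk2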

    Read the article

  • How to convert thousands of PDF files to a single PostScript file in a specified order

    - by tggagne
    I've discovered multiple options for converting a few to several PDFs into PostScript, but many are command-line programs with command-line limitations (this application lives on .NET). Our application generates tens of thousands of PDFs that we need to send to a printer, except BEFORE the PostScript is printed we need to edit it to insert print command instructions (duplex, tray pulls, highlight color, etc.). I think a perfect solution might allow us to write the PDFs to a stream and simultaneously read the output stream, so we may edit the PostScript before writing it to a file. Of course, if I must first create the file containing all 10,000 PDFs and edit it in an additional pass, I'm OK with that, too. I should mention that speed is important. I need to print 10,000 at a time, and need to keep the printers busy 24 hours a day.
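
    One hedged option (assuming Ghostscript is acceptable in the environment; file names are placeholders): the ps2write device renders any number of input PDFs, in the order given, into a single PostScript output, which can then be edited before printing. Writing to - sends the PostScript to stdout, so a .NET process could read and rewrite it as a stream instead of redirecting to a file.

        rem gswin64c is Ghostscript's Windows console binary
        gswin64c -q -dBATCH -dNOPAUSE -sDEVICE=ps2write -sOutputFile=- file0001.pdf file0002.pdf > combined.ps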

    Read the article

  • How to verify mail origin?

    - by MrZombie
    I wish to code a little service where I will be able to send an e-mail to a specific address used by my server to send specific commands to my server. I'll check against a list of permitted e-mail addresses to make sure no one unauthorized sends a command to the server, but how do I make sure that, say, an e-mail sent by "[email protected]" really comes from "thezombie.net"? I thought about checking the header for the originating e-mail server's IP and pinging the domain to make sure it is the same, but would that be reliable? Example:

    1. Server receives a command from [email protected]
    2. [email protected] is authorized, proceed with checks
    3. Server checks "thezombie.net"'s IP from the header: W.X.Y.Z
    4. Server pings "thezombie.net" for its IP: A.B.C.D
    5. The IPs do not correspond; do not process the command

    Is there any better way to do that?
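
    Pinging the domain compares the wrong things: the sending mail server is rarely the host behind the domain's A record. Two standard alternatives are SPF (check the connecting IP against the domain's published sender list) and DKIM (verify a cryptographic signature over the message). A sketch with the dkimpy package, assuming the raw message has been saved to a file:

        # sketch: accept the command only if the message's DKIM signature verifies
        import dkim

        with open("incoming.eml", "rb") as fh:
            raw_message = fh.read()

        if dkim.verify(raw_message):
            print("signature valid for the signing domain - process the command")
        else:
            print("DKIM verification failed - ignore the command")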

    Read the article

  • Cap deploy doesn't work all of a sudden; something to do with FactoryGirl and assets

    - by Jason Swett
    I've been cap deploying my app all throughout its development, and this last time I tried to deploy it, it didn't work. Here's what happened:

        * executing `deploy:assets:precompile'
        * executing "cd /var/www/oneteam/releases/20121006153136 && bundle exec rake RAILS_ENV=production RAILS_GROUPS=assets assets:precompile"
          servers: ["electricsasquatch.com"]
          [electricsasquatch.com] executing command
        ** [out :: electricsasquatch.com] rake aborted!
        ** [out :: electricsasquatch.com] uninitialized constant OneTeam::Application::FactoryGirl
        ** [out :: electricsasquatch.com]
        ** [out :: electricsasquatch.com] (See full trace by running task with --trace)

    It looks like it failed on the deploy:assets:precompile command. I don't get why that command would have tried to do anything with FactoryGirl, though. Any ideas?
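
    A common cause (an assumption about this particular app): something loaded during rake assets:precompile (application.rb, an initializer, or a generator config) references FactoryGirl, but the gem is only bundled for tests, so the constant is missing in the production environment. One fix is to keep the reference out of production code paths and leave the gem scoped in the Gemfile:

        # Gemfile (sketch)
        group :development, :test do
          gem "factory_girl_rails"
        end

    Guarding the offending reference with defined?(FactoryGirl) is another way out if it lives in a shared config file.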

    Read the article

  • Does Process.StartInfo.Arguments support a UTF-8 string?

    - by Patrick Klug
    Can you use a UTF-8 string as the Arguments for a StartInfo? I am trying to pass a UTF-8 (in this case Japanese) string to an application as a console argument. Something like this (this is just an example! cmd.exe would be a custom app):

        var process = new System.Diagnostics.Process();
        process.StartInfo.Arguments = "/K \"echo ????????\"";
        process.StartInfo.FileName = "cmd.exe";
        process.StartInfo.UseShellExecute = true;
        process.Start();
        process.WaitForExit();

    Executing this seems to lose the UTF-8 string, and all the target application sees is "echo ?????????". When executing this command directly on the command line (by pasting the arguments), the target application receives the string correctly, even though the command line itself doesn't seem to display it correctly. Do I need to do anything special to enable UTF-8 support in the arguments, or is this just not supported?

    Read the article

  • SQL Sentry First Impressions

    - by AjarnMark
    After struggling to defend my SQL Servers from a political attack recently, I realized that I needed better tools to back me up, and SQL Sentry is the leading candidate.

    A couple of weeks ago, seemingly from out of nowhere, complaints from the business users started coming in that one of the core internal applications was running dramatically slower than normal, and fingers were being pointed at the SQL Server. Unfortunately, we don't have a production DBA whose entire job is to monitor and maintain our SQL Servers. The responsibility falls to me to do the best I can, investing only a small portion of my time, because there are so many other responsibilities to take care of, and our industry is still deep in recession. I inherited these SQL Servers and have made significant improvements in process and procedure, but I had not yet made the time to take real baseline measurements or keep a really close eye on the performance. Like many DBAs, I wrote several of my own tools and used the "built-in tools" like Profiler, PerfMon, and sp_who2 (did I mention most of our instances are SQL Server 2000?). These have all served me well for in-the-moment troubleshooting and maintenance, but they really fell down on the job when I was called upon to "prove" that SQL Server performance was acceptable and, more importantly, had not degraded recently (i.e. historical comparisons). I really didn't have anything from a historical comparison perspective, but I was able to show that current performance was acceptable, and deflect attention back onto other components (which in fact turned out to be the real culprit).

    That experience dramatically illustrated the need for better monitoring tools. Coincidentally, I had been talking recently to my boss about the mini nightmare of monitoring several critical and interdependent overnight jobs that operate on separate instances of SQL Server. Among other tools, I had been using Idera's SQL Job Manager, which is a free tool and did a nice job of showing me job schedules and histories in a nice calendar view. This worked fairly well, and for the money (did I mention it was free?) it couldn't be beat. But it is based on the stored job history in MSDB, and there were other performance problems that we ran into when we started changing the settings for how much job history to retain, in order to be able to look back a month or more in the calendar view. Another coincidence (if you believe in such things) was that when we had some of those performance challenges, I posted a couple of questions to the #sqlhelp hashtag on Twitter and Greg Gonzalez (@SQLSensei) suggested I check out SQL Sentry's Event Manager. At the time, I just thought he worked there, but later found out that he founded the company. When I took a quick look at the features & benefits, the one that really jumped out at me is Chaining and Queueing, which sounded like it would really help with our "interdependent jobs on different servers" issue.

    I know that is a lot of background story and coincidences, but hopefully you have stuck with me so far, and now we have arrived at the point where last week I downloaded and installed the 30-day trial of the SQL Sentry Power Suite, which is Event Manager plus Performance Advisor. And I must say that I really like what I see so far. Here are a few highlights:

    • Great Support. I had two issues getting the trial set up and monitoring a handful of our servers. One was entirely my fault (missed a security setting in SQL 2008) and the other was mostly my fault (late change to some config settings that were apparently cached and did not get refreshed properly). In both cases, the support staff at SQL Sentry were very responsive and rather quickly figured out what the cause and fix was for each of them. This left me with a great impression of the company. Kudos to them!

    • Chaining and Queueing. While I have not yet activated this feature, I am very excited about the possibilities. We have jobs on three different instances of SQL Server that have to be run in a certain order, and each has to finish before the next can successfully begin, and I believe this feature will ensure just that. It has been a real pain in the backside when one of those jobs runs just a little too long and does not finish before the job on another instance starts, thus triggering a chain reaction of either outright job failures, or worse, successful completion of completely invalid processing.

    • Calendar View. I really, really like the Event Manager calendar view, where I can see all jobs and events across all instances and identify potential resource contention as well as windows of opportunity for maintenance activity. Very well done, and based on Event Manager's own database of accumulated historical information rather than querying the source instances every time.

    • Performance Advisor Dashboard History View. This view lets me quickly select a date and time range, and it displays graphs of key SQL Server and Windows metrics. This is exactly the thing I needed to answer the "has performance changed recently" question at the beginning of this post.

    • Reporting Services Subscription Jobs with Report Name. This was a big and VERY pleasant surprise. If you have ever looked at the list of SQL Server jobs that SQL Server Reporting Services creates when you make a Subscription, you will notice that they all have some sort of GUID as the name of the job. This is really ugly, and really annoying, because when you are just looking at the SQL Agent and Job Activity Monitor, if you see that Job X failed, you really do not have any indication in the name or the properties of the Job itself as to what Report that was for. But with SQL Sentry Event Manager you do. The Jobs list in the Navigator pane in SQL Sentry, amazingly, displays the name of the Report that the Subscription Job is for. And when you open it to see more details, it shows you the full Reporting Services path to that Report, so you can immediately track it down in the Report Manager in case you want to identify/notify the owner or edit the Subscription information. I did not expect this at all, but I sure do like it. HOORAY!

    Those are just my first impressions from using the tools for a few days. And I haven't even gotten into how it showed me where I was completely mistaken about one aspect of my SQL Server disk configurations. I'll share that lesson in another blog entry. But I have to say it again: the combination of Event Manager and Performance Advisor working together has really made me a fan.

    Read the article

  • CommandManager Executed Events don't fire for custom ICommands

    - by Andre Luus
    The WPF CommandManager allows you to do the following (pseudo-ish code):

        <Button Name="SomeButton" Command="{Binding Path=ViewModelCommand}"/>

    And in the code-behind:

        private void InitCommandEvents()
        {
            CommandManager.AddExecutedEventHandler(this.SomeButton, SomeEventHandler);
        }

    The SomeEventHandler never gets called. To me this didn't seem like something very wrong to try and do, but if you consider what happens inside CommandManager.AddExecutedEventHandler, it makes sense why it doesn't. Add to that the fact that the documentation clearly states that the CommandManager only works with RoutedCommands. Nonetheless, this had me very frustrated for a while and led me to this question: what would you suggest is the best workaround for the fact that the CommandManager does not support custom ICommands? Especially if you want to add behavior around the execution of a command? For now, I fire the command manually in code-behind from the button click event.

    Read the article

  • PATH and CLASSPATH in Windows 7 / Eclipse

    - by Richard Knop
    So I would like to set the PATH and CLASSPATH system variables so I can use the javac and java commands on the command line. I can compile and run Java programs in Eclipse, but I would also like to be able to run them from the command line. This is where I have Java installed:

        C:\Program Files (x86)\Java
            jdk1.6.0_20
            jre6

    And this is where Eclipse stores my Java projects:

        D:\java-projects
            HelloWorld
                bin
                    HelloWorld.class
                src
                    HelloWorld.java

    I have set up the PATH and CLASSPATH variables like this:

        PATH:      C:\Program Files (x86)\Java\jdk1.6.0_20\bin
        CLASSPATH: D:\java-projects

    But it doesn't work. When I write:

        java HelloWorld

    Or:

        java HelloWorld.class

    I get an error like this:

        Exception in thread "main" java.lang.NoClassDefFoundError: HelloWorld

    The error is longer; that's just the first line. How can I fix this? I'm mainly interested in being able to run compiled .class programs from the command line; I can do the compiling in Eclipse.
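
    Assuming HelloWorld has no package declaration, the classpath needs to point at the folder that actually contains HelloWorld.class (the project's bin directory, not the parent java-projects folder), and the class name is given without the .class extension. A sketch:

        java -cp "D:\java-projects\HelloWorld\bin" HelloWorld

    The -cp switch overrides CLASSPATH for that one invocation; alternatively, CLASSPATH itself could be set to D:\java-projects\HelloWorld\bin.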

    Read the article

  • New Technology Provides Ample Opportunity

    Throughout history, technological advances have always offered new opportunities that have had a great impact on society. A perfect example is the invention of the printing press. The impact of this invention on society is still being felt today, nearly 600 years later.

    Read the article

  • How can I redirect the output of Perl's system() to a filehandle?

    - by syker
    With the open command in Perl, you can use a filehandle. However, I have trouble getting back the exit code with the open command. With the system command, I can get back the exit code of the program I'm running, but I want to redirect just STDOUT to some filehandle (not STDERR). My stdout is going to be a line-by-line output of key-value pairs that I want to insert into a map in Perl. That is why I want to redirect only the stdout from my Java program. Is that possible? Note: if I get errors, they get printed to stderr. One possibility is to check whether anything gets printed to stderr so that I can quit the Perl script.
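
    Yes; a piped open does exactly this (a sketch, with the command and key/value format assumed): the filehandle carries only the child's STDOUT, STDERR still flows to the parent's STDERR, and the exit code lands in $? when the handle is closed.

        # sketch: read key=value lines from a child process and keep its exit code
        my %map;
        open(my $fh, '-|', 'java', '-jar', 'MyApp.jar') or die "cannot run: $!";
        while (my $line = <$fh>) {
            chomp $line;
            my ($key, $value) = split /=/, $line, 2;
            $map{$key} = $value;
        }
        close($fh);
        my $exit_code = $? >> 8;    # child's exit status, available after close()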

    Read the article

  • Best way to implement a REST API with PHP on a WAMP web server

    - by DomingoSL
    Hello, I own a web server running Windows (WAMP). I want to know the best way to implement a REST API (a very simple one) in order to let a user do something.

    Diagram flow:

    I have programming skills; in fact, some time ago I developed a web server in VB6 that processes the queries and, when it finds the command (http://serverIP/webform.php?cmd=run&item=any), does something. But now I really want to develop a solution using the WAMP server. Some people consider executing an exe when a command is detected a bad solution for security reasons, but this specific project is for the use of only some (trusted) people who have no intention of hacking the server. So, what do you think? Remember:

    • It's not a public API; it's for a few people and programs that will use the API.
    • It's a very simple one: only one command, using POST or GET.

    Thanks
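
    A minimal sketch of one way to do it in plain PHP (the token, whitelist, and script path are assumptions, not part of the question): accept cmd via GET or POST, require a shared secret, and only run commands from a whitelist rather than passing request data to the shell.

        <?php
        // api.php - hypothetical single-endpoint command API
        $allowed = array('run' => 'C:\\scripts\\run.bat');   // whitelist of command => executable

        $token = isset($_REQUEST['token']) ? $_REQUEST['token'] : '';
        if ($token !== 'CHANGE_ME_SHARED_SECRET') {
            header('HTTP/1.1 403 Forbidden');
            exit('forbidden');
        }

        $cmd = isset($_REQUEST['cmd']) ? $_REQUEST['cmd'] : '';
        if (!isset($allowed[$cmd])) {
            header('HTTP/1.1 400 Bad Request');
            exit('unknown command');
        }

        exec($allowed[$cmd], $output, $status);
        header('Content-Type: application/json');
        echo json_encode(array('status' => $status, 'output' => $output));

    Even with trusted users, the whitelist avoids ever concatenating request data into a shell command.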

    Read the article

  • BizTalk 2009 - SQL Server Job Configuration

    - by StuartBrierley
    Following the installation of BizTalk Server 2009 on my development laptop, I used the BizTalk Server Best Practices Analyser, which highlighted the fact that two of the SQL Server Agent jobs that BizTalk relies on were not running successfully. Upon investigation it turned out that these jobs needed to be configured before they would run successfully. To configure these jobs, open SQL Server Management Studio, expand SQL Server Agent > Jobs and double-click on the appropriate job. Select Steps and then edit the appropriate entries.

    Backup BizTalk Server (BizTalkMgmtDb)

    This job is comprised of three steps: BackupFull, MarkAndBackupLog and ClearBackupHistory.

    BackupFull

        exec [dbo].[sp_BackupAllFull_Schedule] 'd' /* Frequency */, 'BTS' /* Name */, '<destination path>' /* location of backup files */

    The frequency here is set/left as daily, and the name is left as BTS. You must provide a full destination path for the backup files to be stored. There are also two optional parameters: a flag that controls whether the job forces a full backup if a partial backup fails, and a parameter to control the time of day to run the full backup (the default is midnight UTC). For example:

        exec [dbo].[sp_BackupAllFull_Schedule] 'd' /* Frequency */, 'BTS' /* Name */, '<destination path>' /* location of backup files */, 0, 22

    MarkAndBackupLog

        exec [dbo].[sp_MarkAll] 'BTS' /* Log mark name */, '<destination path>' /* location of backup files */

    You must provide a destination path for the log backups. Optionally, you can also add an extra parameter that tells the procedure to use local time:

        exec [dbo].[sp_MarkAll] 'BTS' /* Log mark name */, '<destination path>' /* location of backup files */, 1

    ClearBackupHistory

        exec [dbo].[sp_DeleteBackupHistory] @DaysToKeep=7

    This will clear out the instances in the MarkLog table older than 7 days.

    DTA Purge and Archive (BizTalkDTADb)

    This job is comprised of a single step.

    Archive and Purge

        exec dtasp_BackupAndPurgeTrackingDatabase
            0,    --@nLiveHours tinyint,
            1,    --@nLiveDays tinyint = 0,
            30,   --@nHardDeleteDays tinyint = 0,
            null, --@nvcFolder nvarchar(1024) = null,
            null, --@nvcValidatingServer sysname = null,
            0     --@fForceBackup int = 0

    Any completed instance that is older than the live days plus live hours will be deleted, as will any associated data. Any data older than HardDeleteDays will be deleted - this means that those long-running orchestration instances that would otherwise never be purged will at some point have their data cleared down while allowing the instance to continue, thus preventing the DTA database from growing indefinitely. This should always be greater than the soft purge window. The @nvcFolder parameter is the path for the backup files; if this is null the job will not run, and the Archive and Purge step fails with the following error (visible in SQL Server Management Studio under Job Activity Monitor > View History):

        DTA Purge and Archive (BizTalkDTADb) Job failed
        The @nvcFolder parameter cannot be null. (Archive and Purge step)

    How long you choose to keep instances in the Tracking Database is really up to you. For development I have set this up as:

        exec dtasp_BackupAndPurgeTrackingDatabase 0, 1, 30, '<destination path>', null, 0

    On a live server you may want to adjust these figures:

        exec dtasp_BackupAndPurgeTrackingDatabase 0, 15, 20, '<destination path>', null, 0

    Read the article

  • Why doesn't the Inno Setup compiler set the version info correctly from Hudson?

    - by Tim
    If I run the Inno Setup compiler from a command line/batch file, it creates an exe with the version information in the file name. However, when I run it from Hudson (same command line), I don't get the version information. Perhaps I am missing something. Is this a known issue? This is the way I am doing it in the .iss script file:

        #define FileVerStr GetFileVersion(SrcApp)

    EDIT: The env vars are all set for all users - not just my login - so the service has access to everything that the command-line build does.

    EDIT: See my answer for a resolution of this.

    Read the article

  • VisualBasic.NET Database Boilerplate

    - by Shiftbit
    Are there any built-in .NET classes to assist in reducing boilerplate code? I have numerous database operations going on, and I find that I am reproducing the connection, command, transaction and occasionally the data set. I am aware of the related Java question (http://stackoverflow.com/questions/1072925/remove-boilerplate-from-db-code); however, those solutions pertain to Java. I was wondering if anyone was aware of a .NET solution.

        Public Sub ReadData(ByVal connectionString As String)
            Dim queryString As String = "SELECT EmpNo, EName FROM Emp"
            Using connection As New OracleConnection(connectionString)
                Dim command As New OracleCommand(queryString, connection)
                connection.Open()
                Using reader As OracleDataReader = command.ExecuteReader()
                    ' Always call Read before accessing data.
                    While reader.Read()
                        Console.WriteLine(reader.GetInt32(0).ToString() + ", " _
                            + reader.GetString(1))
                    End While
                End Using
            End Using
        End Sub

    (Example adapted from MSDN.)
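
    There is no single built-in helper that removes all of this, but the plumbing can be factored once into a small wrapper (a sketch, not a framework class): pass in the query and a per-row callback, and let the wrapper own the connection, command and reader lifetimes.

        ' Sketch: reusable data-reader helper (names are illustrative)
        Public Sub ForEachRow(ByVal connectionString As String, _
                              ByVal queryString As String, _
                              ByVal onRow As Action(Of IDataReader))
            Using connection As New OracleConnection(connectionString)
                Using command As New OracleCommand(queryString, connection)
                    connection.Open()
                    Using reader As IDataReader = command.ExecuteReader()
                        While reader.Read()
                            onRow(reader)
                        End While
                    End Using
                End Using
            End Using
        End Sub

    The ReadData example above then shrinks to a short callback (a lambda, or a named Sub on older compilers) passed to ForEachRow.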

    Read the article
