Search Results

Search found 26947 results on 1078 pages for 'util linux'.


  • ps -ef + fourth field from ps -ef output

    - by yael
    I need an explanation of the fourth field (the 4 or 0 below) in ps -ef output. What is the meaning of this number? Thanks, yael.

        root 27116 27112 4 15:25 pts/0 00:00:00 grep -qsRw -m1 monitohhhhhhhr /var
        root 29017 27113 0 15:25 pts/0 00:00:00 grep qsRw -m1
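
    For reference: the fourth column of ps -ef is the C field, which reports processor utilization, so the 4 and 0 above are CPU usage figures. Printing the header row alongside a couple of processes makes the column names visible:

        # the header names the columns: UID PID PPID C STIME TTY TIME CMD
        ps -ef | head -3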

    Read the article

  • Android AVD error after update to 2.3

    - by outcast
    (Ubuntu 8.04 + Eclipse 3.6.1 + ADT 8.0.1) I updated Android from (1.5/1.6/2.1/2.2) to 2.3, but when I start an AVD I get:

        1: /home/workaccount/Android/android-sdk-linux_86//tools/emulator: /lib/tls/i686/cmov/libc.so.6: version `GLIBC_2.8' not found (required by /home/workaccount/Android/android-sdk-linux_86//tools/emulator)
        2: /home/workaccount/Android/android-sdk-linux_86//tools/emulator: /lib/tls/i686/cmov/libc.so.6: version `GLIBC_2.11' not found (required by /home/workaccount/Android/android-sdk-linux_86//tools/emulator)

    No AVD version works now. I deleted everything except Eclipse, re-downloaded ADT, the Android SDK, and the 2.3/2.2 platforms, and still get the error. Help!
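
    A quick way to confirm the mismatch is to list the GLIBC version tags the installed libc actually exports (a hedged diagnostic, reusing the library path from the error message). Ubuntu 8.04 ships a glibc older than the 2.8/2.11 the new emulator binary requires, so upgrading the distribution is the usual way out:

        # list the GLIBC_* version symbols the installed libc provides
        strings /lib/tls/i686/cmov/libc.so.6 | grep '^GLIBC_'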

    Read the article

  • Converting .doc and .xls files to HTML on a RHEL 4 server using PHP

    - by prateek
    I have a RHEL 4 server and use PHP as the server-side scripting language. Many .doc and .xls files are uploaded to the server daily; at the moment I only make them available for download. I want to implement a view-as-HTML feature that preserves the formatting. Which tools could be used, or can it be done in PHP alone (PHP 4)?
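
    One hedged approach on a distribution this old is to shell out from PHP to the classic command-line converters, assuming the wv and xlhtml packages can be installed (the file names here are placeholders):

        # Word to HTML via the wv package
        wvHtml uploaded.doc uploaded.html
        # Excel to HTML via the xlhtml package; output goes to stdout
        xlhtml uploaded.xls > uploaded_xls.html

    PHP 4 can invoke either converter through shell_exec() and serve the generated HTML; formatting fidelity varies with the complexity of the document.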

    Read the article

  • 1 domain, 2 servers and 2 applications

    - by basit.
    I have a site like twitter.com on server one, and on server two I have a forum whose path is domain.com/forum. On server one I want to set up wildcard DNS and host the main domain. But I want to keep the forum separate on server two, and I can't move it to a subdomain like forum.domain.com because its links are already indexed by search engines as domain.com/forum. So I was wondering: how can I point the domain and wildcard DNS at server one and still serve domain.com/forum (as a sub-folder) from server two? Any ideas? Do you think .htaccess can do that job? If yes, then how?
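
    .htaccess rewriting alone cannot serve content from another machine, but mod_rewrite's proxy flag can hand the request off, assuming mod_proxy is enabled on server one; a minimal sketch (SERVER_TWO_HOST is a placeholder):

        # .htaccess on server one: pass /forum requests through to server two
        RewriteEngine On
        RewriteRule ^forum/(.*)$ http://SERVER_TWO_HOST/forum/$1 [P,L]

    To browsers and search engines the content still appears under domain.com/forum, which keeps the existing indexed links working.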

    Read the article

  • Problem allocating kernel memory with malloc()

    - by basu sagar
    Is there any protection provided by the kernel? When we tried to allocate memory using malloc(), the kernel allowed us to allocate around 124 MB, but when we tried to write into it, the kernel crashed. If there were protection of the kernel memory area, this wouldn't have happened, I guess.
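
    If this is Linux, the behaviour described fits memory overcommit rather than missing protection: malloc() can hand out address space the kernel has not yet backed with physical pages, and the shortfall only surfaces when the pages are first written (typically as the OOM killer terminating a process rather than a true kernel crash). The policy is tunable; a sketch, assuming root:

        # 0 = heuristic overcommit (default), 1 = always allow, 2 = strict accounting
        cat /proc/sys/vm/overcommit_memory
        sysctl -w vm.overcommit_memory=2   # make allocation fail at malloc() time instead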

    Read the article

  • Regex to find external links in HTML files using grep

    - by Amar
    Hello. For the past few days I have been trying to develop a regex that fetches all the external links from the web pages given to it, using grep. Here is my grep command:

        grep -h -o -e "\(\(mailto:\|\(\(ht\|f\)tp\(s\?\)\)\)\://\)\{1\}\(.*\?\)" "/mnt/websites_folder/folder_to_search" -r

    The grep seems to return everything after the external link on that line. For example, if an HTML file contains something like this on one line:

        <a href="http://www.google.com">Google</a><p><a href='https://yahoo.com'>Yahoo</a></p>

    then the given grep command returns the following result:

        http://www.google.com">Google</a><p><a href='https://yahoo.com'>Yahoo</a></p>

    The idea is that if an HTML file contains more than one link on the same line (irrespective of whether it is in an a, img, etc. tag), the regex should fetch only the links and not all the remaining content of that line. I managed to develop the same thing on rubular.com; the regex is:

        ("|')(\b((ht|f)tps?:\/\/)(.*?)\b)("|')

    It works with the above input, but I am not able to replicate it in grep. Can anyone help? I can't modify the HTML files, so don't ask me to do that; nor can I inspect each specific tag and check its attributes to get the external links, as that adds processing time and my application doesn't demand it. Thank you.
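
    grep's BRE/ERE engines have no non-greedy operator, which is why (.*?) swallows the rest of the line. The usual workaround is a negated character class that stops at the first quote, bracket, or space; a minimal sketch with GNU grep:

        # -o prints only the matched text; [^"'<> ]+ stands in for the non-greedy .*?
        grep -rhoE "(mailto:|(ht|f)tps?://)[^\"'<> ]+" /mnt/websites_folder/folder_to_search

    Where GNU grep was built with PCRE support, grep -P accepts the rubular pattern, including the non-greedy .*?, almost verbatim.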

    Read the article

  • Error implementing static files in Django

    - by POOJA GUPTA
    My settings.py file:

        STATIC_ROOT = '/home/pooja/Desktop/static/'
        # URL prefix for static files.
        STATIC_URL = '/static/'
        # Additional locations of static files
        STATICFILES_DIRS = (
            '/home/pooja/Desktop/mysite/search/static',
        )

    My urls.py file:

        from django.conf.urls import patterns, include, url
        from django.contrib.staticfiles.urls import staticfiles_urlpatterns
        from django.contrib import admin
        admin.autodiscover()

        urlpatterns = patterns('',
            url(r'^search/$', 'search.views.front_page'),
            url(r'^admin/', include(admin.site.urls)),
        )
        urlpatterns += staticfiles_urlpatterns()

    I have created a Django app that searches for keywords in 10 XML documents and returns their frequency counts, displayed as a graph plus a list of filenames with their respective counts. The filenames in the list are hyperlinked, and I want the server to display the files when the user clicks them; for that I used Django's static files provision. The hyperlinking is done like this:

        <ul>
        {% for l in list1 %}
        <li><a href="{{STATIC_URL}}static/{{l.file_name}}">{{l.file_name}}</a>{{l.frequency_count}}</li>
        {% endfor %}
        </ul>

    When I run the app, everything works until I click a filename; then I get this error:

        Using the URLconf defined in mysite.urls, Django tried these URL patterns, in this order:
        ^search/$
        ^admin/
        ^static\/(?P<path>.*)$
        The current URL, search/static/books.xml, didn't match any of these.

    I don't know why this error comes up, because I followed the required steps. I have posted my urls.py file, and the error points at it. I'm new to Django, so please help.
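
    The requested URL search/static/books.xml suggests that {{STATIC_URL}} rendered as an empty string (it is only populated when the template is rendered with a RequestContext and the static context processor is enabled), so the browser resolved static/books.xml relative to /search/. A hedged fix for the template line, which also drops the duplicated static/ prefix since STATIC_URL already ends in /static/:

        <li><a href="{{ STATIC_URL }}{{ l.file_name }}">{{ l.file_name }}</a> {{ l.frequency_count }}</li>

    With that change the link becomes /static/books.xml, which the ^static\/(?P<path>.*)$ pattern added by staticfiles_urlpatterns() does match.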

    Read the article

  • PHP CLI script hangs with no messages

    - by julio
    Hi. I've written a PHP script, run via SSH and nohup, that processes records from a database and does stuff with them (e.g. processes some images, updates some rows). It works fine with small loads, up to maybe 10k records. I have some larger datasets of around 40k records (not a lot, I realize, but it adds up to a lot of work when each record requires downloading and processing up to 50 images). The larger datasets can take days to process. Sometimes I'll see memory errors in my debug logs, which are clear enough, but sometimes the script just appears to die or go zombie on me. The tail of my debug log simply stops with no error message, the tail of the nohup log ends with no error, and the process still shows in a ps listing, looking like this:

        26075 pts/0    S    745:01 /usr/bin/php ./import.php

    but no work is getting done. Can anyone give me some ideas on why a process would just quit? The obvious things (like a PHP script timeout and memory issues) are not a factor, as far as I can tell. Thanks for any tips.

    PS: this is hosted on a GoDaddy VDS (not my choice). I half suspect that GoDaddy has some kind of limit that kicks in despite the overrides I put in the code (such as set_time_limit(0);).
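
    When a process is still in the ps list but silent, attaching a tracer usually reveals whether it is blocked on a read, a lock, or a dead network connection; a hedged first step, assuming strace is installed on the VDS:

        # attach to the apparently stuck PHP process and watch its system calls
        strace -p 26075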

    Read the article

  • write error: Broken pipe

    - by Fahim
    Hi, I have to run a tool on around 300 directories. Each run takes around 1 to 30 minutes, or even longer. So I wrote a Python script with a loop to run the tool on each directory, one after another. My Python script has code something like this:

        for directory in directories:
            os.popen('runtool_exec ' + directory)

    But when I run the Python script I get the following error messages repeatedly:

        tail: write error: Broken pipe
        date: write error: Broken pipe

    All I do is log in on a remote server using SSH, where the tool, the Python script, and the subject directories are kept. When I run the tool individually from the command prompt, with a command like runtool_exec directory, it works fine. The "broken pipe" error comes only when I run it via the Python script. Any idea or workaround? Please suggest. Thanks, Fahim
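
    One plausible cause: os.popen() returns a file object connected to the tool's stdout, and the loop discards it immediately, so the read end of the pipe is closed while commands inside runtool_exec (the tail and date in the messages) are still writing, which produces "Broken pipe". A hedged workaround is to redirect the tool's output at the shell level instead of letting it go to a pipe nobody reads; the sketch below drives the loop from the shell, with dir_*/ standing in for the 300 directories:

        # let the tool write to a log file rather than an unread pipe
        for d in dir_*/; do
            runtool_exec "$d" >> runtool.log 2>&1
        done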

    Read the article

  • How to detect segmentation fault details using Valgrind?

    - by Davit Siradeghyan
    Hi all. I have a std::map< std::string, std::string > which is initialized by an API call. When I try to use this map, I get a segmentation fault. How can I detect the invalid code, or get any detail that can help me fix the problem? The code looks like this:

        std::map< std::string, std::string > cont;
        some_func( cont );  // takes the parameter by reference and initializes it
        std::cout << cont[ "some_key" ] << '\n';  // segmentation fault here
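
    A typical memcheck invocation for this situation; --track-origins traces uninitialized values back to their source, and building with -g makes the reports carry file and line numbers (./your_program is a placeholder):

        # run the binary under Valgrind's memcheck with fuller diagnostics
        valgrind --leak-check=full --track-origins=yes ./your_program

    The first invalid read or write it reports, with its stack trace, usually points at the corruption well before the crash site.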

    Read the article

  • How does `cat << EOF` work in bash?

    - by hasen j
    I needed to write a script to feed multi-line input to a program (psql). After a bit of googling, I found that the following syntax works:

        cat << EOF | psql ---params
        BEGIN;
        `pg_dump ----something`
        update table .... statement ...;
        END;
        EOF

    This correctly concatenates all these strings and passes the result as input to psql, but I have no idea how or why it works. Can someone please explain? I'm referring mainly to cat << EOF. I know > outputs to a file, >> appends to a file, and < reads input from a file. What exactly does << do? And is there a man page for it?
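
    In short, << introduces a here-document: the shell reads the lines that follow, up to a line containing only the chosen delimiter word (EOF here, but any word does), and supplies them as the command's standard input. It is documented in the "Here Documents" section of man bash. A tiny sketch:

        # everything between <<EOF and the closing EOF becomes cat's stdin;
        # variables and backticks are expanded unless the delimiter is quoted ('EOF')
        cat <<EOF
        current user: $USER
        EOF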

    Read the article

  • How to execute with a /bin/false shell

    - by Amar
    Hello, I am trying to set up per-user FastCGI scripts that each run on a different port and as a different user. Here is an example of my script:

        #!/bin/bash
        BIND=127.0.0.1:9001
        USER=user
        PHP_FCGI_CHILDREN=2
        PHP_FCGI_MAX_REQUESTS=10000
        etc...

    However, if I add a user with /bin/false (which I want, since this is meant to be something like shared hosting and I don't want users to have shell access), the script is run as user 1001, 1002, etc., which, as I googled, might be a security hole. My question is: is it possible to allow users to execute shell scripts but prevent them from logging in via SSH? Thank you.
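
    The shell listed in /etc/passwd only governs login sessions, so a user can keep /bin/false and still have scripts run on their behalf: root can override the shell for a single invocation. A minimal sketch (the wrapper path is a placeholder):

        # run one command as the no-login user, overriding /bin/false just for it
        su -s /bin/bash -c '/path/to/fcgi-wrapper.sh' username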

    Read the article

  • Editing history in bash

    - by nameanyone
    In bash, when I go back in history, edit some command, and run it, the edited command is appended to history and the original is left intact. But every once in a while I somehow manage to affect the original command, i.e. my edit replaces the original command in history. I can't put my finger on how this happens. Can someone explain? My goal is to avoid this, so that any edit of a previous command always gets appended to history and never replaces the original.
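
    The usual culprit is readline: if you recall a history entry, edit it, and then move to another line without accepting it, the edited text can stay attached to that entry. Recent readline versions have a setting that reverts such in-place edits whenever a line is accepted; a sketch, assuming your readline supports it:

        # add to ~/.inputrc (or run: bind 'set revert-all-at-newline on')
        set revert-all-at-newline on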

    Read the article

  • Convert Chunk of Data into Tabular Format Using Perl

    - by neversaint
    I have data that looks like this:

        1:SRX000566
        Submitter: WoldLab
        Study: RNASeq expression profiling for ENCODE project(SRP000228)
        Sample: Human cell line GM12878(SRS000567)
        Instrument: Solexa 1G Genome Analyzer
        Total: 4 runs, 62.7M spots, 2.1G bases
        Run #1: SRR002055, 11373440 spots, 375323520 bases
        Run #2: SRR002063, 22995209 spots, 758841897 bases
        Run #3: SRR005091, 13934766 spots, 459847278 bases
        Run #4: SRR005096, 14370900 spots, 474239700 bases

        2:SRX000565
        Submitter: WoldLab
        Study: RNASeq expression profiling for ENCODE project(SRP000228)
        Sample: Human cell line GM12878(SRS000567)
        Instrument: Solexa 1G Genome Analyzer
        Total: 3 runs, 51.2M spots, 1.7G bases
        Run #1: SRR002052, 12607931 spots, 416061723 bases
        Run #2: SRR002054, 12880281 spots, 425049273 bases
        Run #3: SRR002060, 25740337 spots, 849431121 bases

        3:SRX012407
        Submitter: GEO
        Study: GSE17153: Illumina sequencing of small RNAs from C. elegans embryos(SRP001363)
        Sample: Caenorhabditis elegans(SRS006961)
        Instrument: Illumina Genome Analyzer II
        Total: 1 run, 3M spots, 106.8M bases
        Run #1: SRR029428, 2965597 spots, 106761492 bases

    Is there a compact way to convert it into tab-separated tabular format, one entry/row per chunk (in this case, 3 rows)? I tried this, but it doesn't seem to work:

        perl -laF/\n/ -000ne"print join chr(9),@F" myfile.txt
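
    A hedged sketch using Perl's paragraph mode instead: -00 makes each blank-line-separated chunk a single record, chomp strips the record's trailing newlines, and the remaining internal newlines then become tabs:

        perl -00 -ne 'chomp; s/\n/\t/g; print "$_\n"' myfile.txt

    This prints one tab-separated row per chunk, three rows for the sample above.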

    Read the article

  • PHP set timeout for script with system call, set_time_limit not working

    - by tehalive
    I have a command-line PHP script that runs a wget request for each member of an array, using foreach. The wget request can sometimes take a long time, so I want to be able to kill the script if it runs past 15 seconds, for example. I have PHP safe mode disabled and tried set_time_limit(15) early in the script, but it continues indefinitely.

    Update: thanks to Dor for pointing out that this is because set_time_limit() does not count time spent in system() calls. So I am trying to find other ways to kill the script after 15 seconds of execution. However, I'm not sure it's possible to check how long the script has been running while it is in the middle of a wget request (a do-while loop did not work). Maybe fork a process with a timer and have it kill the parent after a set amount of time? Thanks for any tips!

    Update: below is my relevant code. $url is passed from the command line and is an array of multiple URLs (sorry for not posting this initially):

        foreach( $url as $key => $value ){
            $wget = "wget -r -H -nd -l 999 $value";
            system($wget);
        }
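
    Since the time is spent inside system(), one hedged alternative is to enforce the limit at the shell level, in the command string itself; GNU coreutils' timeout does this where available, and wget's own options can cap network stalls ($value is the PHP-interpolated URL as in the code above):

        # kill the whole crawl after 15 seconds of wall-clock time
        timeout 15 wget -r -H -nd -l 999 $value
        # or lean on wget's own limits: --timeout caps each network operation
        wget -r -H -nd -l 999 --timeout=15 --tries=1 $value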

    Read the article

  • Problem with Ubuntu 10.04 and ATI

    - by Maximilinao
    Hello, I'm having a problem installing the drivers for my video card. As soon as I finish installing the drivers and restart my operating system, my monitor stays off in power-saving mode and Ubuntu does not start. What can I do? Sorry for my English; I used Google Translate.
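
    When a proprietary ATI driver install blanks the display on Ubuntu 10.04, a common recovery is to boot into recovery mode (hold Shift at boot to reach the GRUB menu) and remove the driver from the root console; a hedged sketch, assuming the fglrx packages are what was installed:

        # remove the proprietary driver and fall back to the open-source one
        apt-get remove --purge fglrx fglrx-amdcccle
        # set aside any xorg.conf the installer generated
        mv /etc/X11/xorg.conf /etc/X11/xorg.conf.bak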

    Read the article

  • Killing HTML nodes from shell

    - by hendry
    I need a way to remove nodes like <footer>foobar</footer> and <div class="nav"></div> from many HTML files. I want to dump a site to disk without the menus, footers, and so on. Ideally I would accomplish this with basic Unix tools like sed; since the pages are not valid XML, I can't use xmlstarlet. Could anyone please suggest recipes, so I can ideally have a script and run kill-node.sh 'div class="toplinks"' *.html to prune the bits I don't want. Thank you.
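
    sed works line by line, which gets painful once an element spans lines, so the sketch of kill-node.sh below leans on Perl's slurp mode for a multi-line, non-greedy delete. It matches on the tag name only (attributes are swallowed by [^>]*), and like any regex-over-HTML approach it will misbehave on nested tags of the same name:

        #!/bin/sh
        # usage: kill-node.sh tagname file.html...
        # deletes every <tagname ...>...</tagname> block in place
        tag="$1"; shift
        perl -0777 -pi -e "s#<${tag}\b[^>]*>.*?</${tag}>##gs" "$@"

    For example, kill-node.sh footer *.html strips the <footer> blocks from every file.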

    Read the article

  • Sed does not work in expect

    - by Sharjeel Sayed
    I made this bash one-liner, which I use to list running WebLogic instances along with their full paths. It works well when I run it from the shell:

        /usr/ucb/ps auwwx | grep weblogic | tr ' ' '\n' | grep security.policy | grep domain | awk -F'=' '{print $2}' | sed 's/weblogic.policy//' | sed 's/security\///' | sort

    I tried to incorporate it into an expect script:

        send "echo Weblogic Processes: ; /usr/ucb/ps auwwx | grep weblogic | tr ' ' '\n' | grep security.policy | grep domain | awk -F'=' '{print \$2}' | sed 's/weblogic.policy//' | sed 's/security\///' | sort ; echo ; echo\r"

    but I got this error:

        sed: -e expression #1, char 13: unknown option to `s'

    Please help.
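
    The likely culprit is Tcl rather than sed: inside expect's double-quoted send string, backslash substitution turns \/ into a plain /, so sed receives s/security/// and reports "unknown option to `s'". Doubling the backslash (\\/) would survive, but switching the sed delimiter avoids the escaping altogether; a sketch of the same send line with only that sed changed:

        send "echo Weblogic Processes: ; /usr/ucb/ps auwwx | grep weblogic | tr ' ' '\n' | grep security.policy | grep domain | awk -F'=' '{print \$2}' | sed 's/weblogic.policy//' | sed 's#security/##' | sort ; echo ; echo\r"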

    Read the article
