Search Results

Search found 27870 results on 1115 pages for 'standard output'.

Page 445 of 1115

  • Why does service service_name status give different responses for different services?

    - by Code Ghar
    Running the "service service_name status" command gives three different kinds of output when three different service_names are used. I tried it with atftpd, apache2, and isc-dhcp-server, as shown below.

        user@host:~$ service atftpd status
        Usage: /etc/init.d/atftpd {start|stop|restart|reload|force-reload}
        user@host:~$ service apache2 status
        Apache2 is running (pid 1103).
        user@host:~$ service isc-dhcp-server status
        isc-dhcp-server start/running, process 5696

    Could this be because atftpd has not been converted to Upstart? The status for isc-dhcp-server shows "start/running", which indicates it has been converted to Upstart. I would have thought apache2 had already been converted to Upstart; if it has, why does it not display "start/running"?
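
    A quick way to tell which init system owns each service (a generic sketch, not specific to this machine): Upstart jobs carry a .conf file under /etc/init/ and answer to initctl, while services still on SysV init only have a script in /etc/init.d/.

        # Upstart jobs have a .conf under /etc/init/; SysV-only services do not.
        ls /etc/init/ | grep -E 'atftpd|apache2|isc-dhcp-server'
        ls /etc/init.d/ | grep -E 'atftpd|apache2|isc-dhcp-server'
        # Upstart-managed jobs also report status through initctl directly:
        initctl status isc-dhcp-server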

    Read the article

  • Sense of "stop on..." stanza when job is a task

    - by Binarus
    Hi, an Upstart question (I think I have read all the relevant man pages but could not find the answer there): what is the sense of using a "stop on ..." stanza in the definition of a job which is a task? The manuals tell us that such a job, after being started, just waits until its script (or exec stanza) has executed completely, and then stops automatically. Given that, what is the point of using "stop on ..." stanzas in such job definitions? For example, this is the job definition for Upstart's (very important) rc job in Natty 11.04 (leaving out comments and empty lines):

        start on runlevel [0123456]
        stop on runlevel [!$RUNLEVEL]
        export RUNLEVEL
        export PREVLEVEL
        console output
        env INIT_VERBOSE
        task
        exec /etc/init.d/rc $RUNLEVEL

    IMHO the job, after being started by a runlevel event, will be stopped automatically as soon as /etc/init.d/rc $RUNLEVEL has finished. Thank you very much for any explanation!

    Read the article

  • Structuring an input file

    - by Ricardo
    I am in the process of structuring a small program to perform some hydraulic analysis of pipe flow. As I envision it, the program will read an input file, store the input parameters in a suitable way, operate on them and finally output results. I am struggling with how to structure the input file in a sane way; that is, in a way that a human can write easily and a machine can parse easily. A sample input file made available to me for a similar program is just a stream of comma-separated numbers that don't make much sense on their own, so that's the scenario I am trying to avoid. Though I am giving the details of my particular problem, I am more interested in general input-file structuring strategies. Is a stream of comma-separated values my best bet? Would I be better off using some sort of key:value structure? I don't know much about this, so any help will probably put me on a better track than I am on now.
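
    To make the key:value idea concrete, here is a minimal sketch (the file name, parameter names and units are invented for illustration): each line names one quantity, so the file stays readable to a human and trivial to parse in almost any language.

        # A hypothetical pipe-flow input file in key = value form:
        cat > pipe_case1.in <<'EOF'
        pipe_length_m  = 120.5
        diameter_mm    = 150
        roughness_mm   = 0.045
        flow_rate_lps  = 25
        EOF

        # Any language can split such lines on '='; e.g. a quick shell check:
        while IFS='=' read -r key value; do
            printf 'parameter %-15s -> %s\n' "$(echo "$key" | xargs)" "$(echo "$value" | xargs)"
        done < pipe_case1.in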

    Read the article

  • Radeon HD5570 HDMI Video Card 5.1 Audio doesn't work

    - by ryandlf
    I am using Ubuntu and XBMC on my HTPC and have chosen the Radeon HD5570 video card, which has an HDMI output. In the sound preferences there is no surround-sound option for the video card, just stereo, and although I can get sound through it in XBMC, my receiver does not show Dolby Digital on movies that are in fact Dolby, so it's definitely not giving me the true sound it should. Does this card not support surround sound through HDMI, or have I somehow missed it? If that is the case, does anyone have a suggestion that has been tested and works? I'd like to know it's going to work before investing in yet another video card. UPDATE: I purchased an Nvidia GeForce GTS 450, plugged it in, downloaded the proprietary driver from the system control panel, disabled the onboard audio in the BIOS (not sure if this was necessary, but I did it anyway), and changed the sound settings to use the new video card. Everything works flawlessly; it was a seamless setup.

    Read the article

  • Error when I compile kernel 3.3.2 in ubuntu 12.04

    - by rock-alternativo
    I think this is not an Ubuntu bug. This is the output:

        OBJCOPY arch/x86/boot/vmlinux.bin
        HOSTCC  arch/x86/boot/tools/build
        BUILD   arch/x86/boot/bzImage
        Setup is 16800 bytes (padded to 16896 bytes).
        System is 4599 kB
        CRC f77d64c0
        Kernel: arch/x86/boot/bzImage is ready (#1)
        Building modules, stage 2.
        MODPOST 3268 modules
        ERROR: "__modver_version_show" [drivers/staging/rts5139/rts5139.ko] undefined!
        WARNING: modpost: Found 4 section mismatch(es).
        To see full details build your kernel with:
        'make CONFIG_DEBUG_SECTION_MISMATCH=y'
        make[1]: *** [__modpost] Error 1
        make: *** [modules] Error 2
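
    The error comes from the rts5139 staging driver rather than from the core kernel, so one commonly suggested workaround (an assumption, not a verified fix for this exact tree) is to leave that driver out of the configuration and rebuild:

        cd ~/linux-3.3.2                      # path to the kernel source tree (example)
        ./scripts/config --disable RTS5139    # clears CONFIG_RTS5139 in .config
        make oldconfig
        make -j"$(nproc)"
        sudo make modules_install install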

    Read the article

  • Scalability of multi-threading in game server

    - by Taylor Hill
    What is a reasonable number of threads for a simple 2D MMO in Java? Is it reasonable to have two threads per connection, one for the input stream and one for the output stream? The reason I ask is that I use a blocking method on the input stream, and a workaround seems unnecessarily complex if I were to try to get around it without adding threads. This is mostly for my own edification; I don't expect to have 5 million people playing it ever, or even 5, but I'm wondering what a good scalable solution is, and whether this is reasonable for a small server (<30 connections).

    Read the article

  • Ubuntu 13.10 change first weekday to Monday in calendar applet

    - by wonderingapple
    Before the update (I was using 13.04), editing /etc/default/locale (sudo gedit /etc/default/locale) so that LC_TIME="en_GB.UTF-8" did the job. However, in 13.10 this does not work any more. I have tried editing

        sudo gedit /usr/share/i18n/locales/en_AU
        sudo gedit /usr/share/i18n/locales/en_GB
        sudo gedit /usr/share/i18n/locales/en_US

    so that first_weekday is 2 in each of the files, but this also does not work. For reference, when I run locale, the output is

        LANG=en_AU.UTF-8
        LANGUAGE=en_AU:en
        LC_CTYPE="en_AU.UTF-8"
        LC_NUMERIC=en_AU.UTF-8
        LC_TIME=en_AU.UTF-8
        LC_COLLATE="en_AU.UTF-8"
        LC_MONETARY=en_AU.UTF-8
        LC_MESSAGES="en_AU.UTF-8"
        LC_PAPER=en_AU.UTF-8
        LC_NAME=en_AU.UTF-8
        LC_ADDRESS=en_AU.UTF-8
        LC_TELEPHONE=en_AU.UTF-8
        LC_MEASUREMENT=en_AU.UTF-8
        LC_IDENTIFICATION=en_AU.UTF-8
        LC_ALL=

    Please help.
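
    One step that is easy to miss (a sketch of the usual procedure, not a verified fix for 13.10): the files under /usr/share/i18n/locales/ are only sources, so the compiled locales have to be regenerated before an edited first_weekday can take effect.

        sudo locale-gen       # rebuild the compiled locales from the edited sources
        locale first_weekday  # should print 2 after logging out and back in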

    Read the article

  • Help! Installer crashing

    - by Mike
    I had Ubuntu 12.04 installed on a Satellite L635. I wanted to start over with a fresh install because I had experimented with a bunch of different things, and the computer was getting glitchy, with random freezes. I used the same ISO I used for the original install, and it keeps saying the installer crashed with an errno 5 input/output error. I tried an older 11.10 ISO I had used in the past, with the same issue. To see if it was the disc or the computer (not a great idea), I tried the 12.04 ISO on another laptop. Now I have 2 laptops with no OS. Both discs, however, live-boot on both systems. PLEASE HELP! I've searched the forums and can't find the same problem.
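
    Errno 5 during installation usually points at unreadable media or a corrupted image rather than at the target machine, so a first step worth sketching (the ISO filename below is only an example) is to verify the image and the disc before retrying:

        md5sum ubuntu-12.04-desktop-amd64.iso   # compare with the checksum published on releases.ubuntu.com
        # Burned discs can also be verified from their own boot menu
        # ("Check disc for defects") before starting another install.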

    Read the article

  • Why is Web SQL database deprecated?

    - by user221287
    I am making a hybrid Android app. At first I decided to use localStorage; after spending two days on it, I realized that it is very strange, and so I dropped it. Then I picked up IndexedDB; after spending today's whole day on it, and actually getting the output in Google Chrome, it is not running inside a WebView of the Android app. And I never used the Web SQL database at all, because it was deprecated. Anyhow, it has come to my notice that PhoneGap still uses Web SQL, and Android's browsers support it. Why was Web SQL deprecated in the first place? And would it be a good idea for me to go with Web SQL now?

    Read the article

  • W520 External monitor setup with Ubuntu 12.10

    - by user108372
    I just installed a fresh Ubuntu 12.10 64-bit desktop on my Lenovo W520. It looks like there are a lot of challenges around making it work with the out-of-the-box Nouveau drivers, the proprietary Nvidia drivers, or the Intel GPU. I looked at a couple of notes on how to make it work with Bumblebee and Optimus; none of them seems to work for 12.10. Does anybody have a solid answer on this? It seems like a lot of people are suffering from it. Here is my xrandr output; let me know if you need any additional information.

        Screen 0: minimum 320 x 200, current 1920 x 1080, maximum 8192 x 8192
        LVDS1 connected 1920x1080+0+0 (normal left inverted right x axis y axis) 344mm x 193mm
           1920x1080      60.0*+   59.9     50.0
           1680x1050      60.0     59.9
           1600x1024      60.2
           1400x1050      60.0
           1280x1024      60.0
           1440x900       59.9
           1280x960       60.0
           1360x768       59.8     60.0
           1152x864       60.0
           1024x768       60.0
           800x600        60.3     56.2
           640x480        59.9
        VGA1 disconnected (normal left inverted right x axis y axis)

    Thanks, Sef
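
    For reference, the commonly documented Bumblebee setup on 12.10 looked roughly like the sketch below; whether it also drives the W520's external ports, which are usually wired to the discrete GPU, is not guaranteed.

        sudo add-apt-repository ppa:bumblebee/stable
        sudo apt-get update
        sudo apt-get install bumblebee bumblebee-nvidia
        optirun glxinfo | grep "OpenGL renderer"   # should report the NVIDIA GPU when offloading works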

    Read the article

  • xdebug 2.2.1 installed but not working with cgi

    - by ts01
    I've installed xdebug (via pecl). It is installed (as the phpinfo() output indicates), but it doesn't seem to work with CGI (with the CLI it works). I've restarted Apache, without result. Any ideas? Some config details (as parsed by http://xdebug.org/wizard.php):

        Xdebug installed: 2.2.1
        Server API: Apache 2.0 Handler
        Windows: no
        Zend Server: no
        PHP Version: 5.3.10-1
        Zend API nr: 220090626
        PHP API nr: 20090626
        Debug Build: no
        Thread Safe Build: no
        Configuration File Path: /etc/php5/apache2
        Configuration File: /etc/php5/apache2/php.ini
        Extensions directory: /usr/lib/php5/20090626+lfs
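
    One thing worth checking (a sketch; the paths follow the usual Debian/Ubuntu layout and should be verified locally): PHP running as CGI reads a different php.ini than the Apache module does, so xdebug has to be enabled in that file as well.

        php-cgi -i | grep "Loaded Configuration File"   # typically /etc/php5/cgi/php.ini
        # That file (or a conf.d snippet it includes) also needs the extension line,
        # using the extensions directory reported by phpinfo(), e.g.:
        #   zend_extension=/usr/lib/php5/20090626+lfs/xdebug.so
        sudo service apache2 restart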

    Read the article

  • Upgrade tree to 1.6?

    - by Pureferret
    I'm trying to upgrade my version of tree to 1.6 on Ubuntu 12.04. I've downloaded it, then run make and make install in the terminal using sudo:

        ~/tree-1.6.0$ sudo make
        make: Nothing to be done for `all'.     (I've already run sudo make here)
        ~/tree-1.6.0$ sudo make install
        install -d /usr/bin
        install -d /usr/man/man1
        if [ -e tree ]; then \
            install -s tree /usr/bin/tree; \
        fi
        install doc/tree.1 /usr/man/man1/tree.1

    What does this output mean, though? tree is still not updated: I've checked the man page, and -du doesn't work. How am I supposed to update tree, if not via the terminal?
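
    The install log above says the new binary went to /usr/bin/tree, so a quick sanity check (generic shell, nothing version-specific) is to see which tree the shell is actually running:

        type -a tree      # every 'tree' on the PATH, in lookup order
        tree --version    # should report v1.6.0 if the freshly installed binary wins
        hash -r           # clear bash's cached command locations, just in case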

    Read the article

  • How to implement custom texture formats in Android?

    - by random1337
    What I know: Android can load PNG, BMP, WEBP, ... via BitmapFactory. What I want to achieve: load my own 2D file format (e.g. a 1-bit texture with a 1-bit alpha channel) and output an RGBA8888 texture. Question: is there any interface to achieve this (or any other way)? The resulting image is used as a texture for a 3D model. Why would you do that? Saving phone memory and download bandwidth while expanding the texture to RAM at runtime seems reasonable for very simple textures.

    Read the article

  • Bad sound quality and headphones not working

    - by wifi
    Using Ubuntu 10.10 on an HP Pavilion t3019.es, which has a Realtek ALC880 sound card. It has 6 rear jack outputs, plus digital audio input and output, plus 3 front jacks (mic, headphones, and a blue one whose purpose I don't know). The sound on my computer is very low, and when I raise the volume above 50% it starts sounding distorted and crackly. Also, the headphones don't work when I plug them in (it just keeps playing through the speakers). I tried editing the /etc/modprobe.d/alsa-base.conf file according to the sound card and jacks in my computer, but none of the lines I added worked (naturally, I didn't add them all at once). I found out that adding "options snd-hda-intel model=generic" made the sound better, but it's still not as good as in Windows. Any other ideas? Setting the PCM value didn't work for me either.
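
    The ALC880 driver accepts several board-specific model hints besides "generic"; which one, if any, matches this Pavilion is an assumption, but with six rear jacks the 6-stack variants are the usual candidates. A sketch of trying one:

        # Try one model hint at a time (6stack, 6stack-digout, 5stack, 3stack-digout, ...):
        echo 'options snd-hda-intel model=6stack-digout' | sudo tee -a /etc/modprobe.d/alsa-base.conf
        sudo alsa force-reload    # or reboot, then re-test the volume and the headphone jack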

    Read the article

  • disable intel gpu in ubuntu 12.04

    - by small_potato
    I am wondering if there is any way to disable the Intel GPU on Ubuntu 12.04. I want to be able to set up dual monitors using nvidia-settings. It seems the Intel GPU is used for the display, as suggested by sudo lshw -c display; the output is

        *-display
             description: VGA compatible controller
             product: NVIDIA Corporation
             vendor: NVIDIA Corporation
             physical id: 0
             bus info: pci@0000:01:00.0
             version: a1
             width: 64 bits
             clock: 33MHz
             capabilities: pm msi pciexpress vga_controller bus_master cap_list rom
             configuration: driver=nvidia latency=0
             resources: irq:16 memory:c0000000-c0ffffff memory:90000000-9fffffff memory:a0000000-a1ffffff ioport:4000(size=128) memory:a2000000-a207ffff
        *-display
             description: VGA compatible controller
             product: Haswell Integrated Graphics Controller
             vendor: Intel Corporation
             physical id: 2
             bus info: pci@0000:00:02.0
             version: 06
             width: 64 bits
             clock: 33MHz
             capabilities: msi pm vga_controller bus_master cap_list rom
             configuration: driver=i915 latency=0
             resources: irq:47 memory:c2000000-c23fffff memory:b0000000-bfffffff ioport:5000(size=64)

    I have a Lenovo Y410 with a GT750M. It seems there is no way to turn off the Intel GPU in the BIOS either. Help, please. Thanks.
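
    On most Optimus laptops the Intel GPU is hard-wired to the panel and cannot simply be switched off, but one approach people try (a sketch only; it is not guaranteed to light the panel on muxless hardware, and Bumblebee or nvidia-prime is the usual fallback) is to point X explicitly at the discrete card using its bus ID from the lshw output above:

        sudo tee /etc/X11/xorg.conf > /dev/null <<'EOF'
        Section "Device"
            Identifier "nvidia"
            Driver     "nvidia"
            BusID      "PCI:1:0:0"
        EndSection
        EOF
        sudo service lightdm restart   # keep a recovery console handy in case X fails to start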

    Read the article

  • The Orchard Project Planting a Few Seeds

    Orchard is a free, open source, community-focused project aimed at delivering applications and reusable components on the ASP.NET platform. The broad vision of the Orchard project is to grow the ASP.NET open source community and partner with existing application authors to help them achieve their goals. The intended output of the Orchard project is three-fold: individual .NET-based applications that appeal to end users, scripters, and developers; a set of re-usable components that...

    Read the article

  • Do unused vertices in a 3D object affect performance?

    - by Gajet
    For my game I need to generate a mesh dynamically. Now I'm wondering: does it have a noticeable effect on FPS if I allocate more vertices than I'm actually using? And does it matter whether I'm using DirectX or OpenGL? Edit: the final output will be a w*h cell grid, but for technical reasons it's much easier for me to allocate (w+1)*(h+1) vertices. Sure, I'll only use w*h vertices in the indexing, and I know there is some memory wasted there, but I want to know whether it also affects FPS or not. (Note that the mesh is only generated once each time you play the game.)

    Read the article

  • No sound after updating from 11.10 to 12.04

    - by shaneo
    Hi, I updated a friend's computer from 11.10 to 12.04 and the sound stopped working on the built-in speakers. The computer is a Sony Vaio PCG-2J1L. I tried using the guide found here: https://help.ubuntu.com/community/HdaIntelSoundHowto, and still no luck.

        $ cat /proc/asound/card0/codec* | grep Codec
        Codec: Realtek ALC889
        $ cat /proc/asound/card0/pcm0c/info
        card: 0
        device: 0
        subdevice: 0
        stream: CAPTURE
        id: ALC889 Analog
        name: ALC889 Analog
        subname: subdevice #0
        class: 0
        subclass: 0
        subdevices_count: 1
        subdevices_avail: 1

    The full cat /proc/asound/card0/codec* output can be found here: http://paste.ubuntu.com/1043188/. If any more information is needed, just let me know and I'll post it here. Thank you in advance for any help with this issue.
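
    A generic first round of checks after an upgrade silences the internal speakers (nothing here is specific to this Vaio):

        alsamixer -c 0            # make sure Master/Speaker/PCM channels are not muted ("MM")
        speaker-test -c 2 -t wav  # simple playback test on the default device
        pulseaudio -k             # restart PulseAudio in case it is holding the wrong sink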

    Read the article

  • Switching to an external display, when primary is broken

    - by Shazzner
    I've successfully installed Ubuntu 11.10 Desktop (x86) on an old(er) laptop that unfortunately has a broken screen, so there is an external monitor plugged in. On the live CD it came up on the secondary display just fine, and I was able to install Ubuntu and everything. Unfortunately, when I now reboot into Ubuntu proper, the secondary display is off and I'm literally driving blind here trying to switch over to it. I'm using the Nvidia open source drivers. Things I've tried:

    - Rebooting back into the live CD, mounting the partition and trying in vain to find a config file (it uses the open source drivers, so there is no Xorg.conf I could edit manually).
    - Trying to blind-type xrandr settings into what I hope is a terminal: xrandr --output VGA1 --auto (nothing happened).
    - Trying to blind-install openssh-server so I could SSH in and maybe configure it from my working computer; for some reason, no luck.

    Ubuntu really should default to expanding to all screens for this use case.
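
    For the blind-typing approach, a slightly more forceful one-liner is usually suggested (a sketch; with the open source driver the output may be named VGA-1 rather than VGA1, so trying both spellings costs nothing):

        # Typed blind into a graphical terminal on the broken panel:
        xrandr --output VGA1 --auto --primary
        # Or run over ssh once openssh-server is in place (note the DISPLAY prefix):
        DISPLAY=:0 xrandr --output VGA-1 --auto --output LVDS-1 --off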

    Read the article

  • Screen Resolution Problem with Ubuntu 14.04 and VirtualBox

    - by user3341257
    Environment: Lenovo T530 running Windows 7. I have installed Ubuntu 14.04 in a virtual machine using VM VirtualBox, and have installed all the updates from both VirtualBox and Ubuntu. Problem: Ubuntu's desktop and other Ubuntu programs are reduced to a window of about 3x4 inches in the middle of the rest of my regular VirtualBox window, and I am seeing only the upper right-hand portion of the screen output I would normally see. Please help. I've seen "How do I install Guest Additions in a VirtualBox VM?", but none of those answers works in 14.04.
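
    For what it's worth, the usual Guest Additions procedure inside a 14.04 guest is sketched below; the build tools have to be present before the installer on the Additions CD can build its kernel modules.

        sudo apt-get update
        sudo apt-get install build-essential dkms linux-headers-$(uname -r)
        # Then Devices -> "Insert Guest Additions CD image..." in the VirtualBox menu, and:
        sudo mount /dev/cdrom /mnt
        sudo /mnt/VBoxLinuxAdditions.run
        sudo reboot   # the guest desktop should then track the VirtualBox window size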

    Read the article

  • Unable to start GUI app from upstart

    - by novice
    As part of the post-start of my app, say "mydaemon", I want to launch a GUI app, say "mygui". I am unable to do this. I have verified the user permissions using xhost, and the DISPLAY variable is set correctly. The conf file in /etc/init/ is given below.

        me@ubuntu:~/term$ cat /etc/init/agentd.conf
        description "my daemon"
        author "me"
        start on runlevel [2345]
        stop on runlevel [016]
        console output
        kill timeout 60
        respawn
        respawn limit 3 15
        # Allow some clean up time
        post-stop script
            env DISPLAY=:0.0
            cd /home/me
            ./mygui
            sleep 1
        end script
        script
            cd /home/me
            ./myapp
        end script
        post-start script
            env DISPLAY=:0.0
            cd /home/me
            ./mygui
        end script
        sdn@ubuntu:~/term$

    Any suggestions?
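
    One detail worth noting about the snippet above (the rest is a sketch using assumed paths): inside a script section, "env DISPLAY=:0.0" just runs /usr/bin/env and does not set the variable for the commands that follow; exporting it does, and a GUI launched from a root-owned job usually also needs the desktop user's X authority file.

        # Shell lines that could replace the body of the post-start section:
        export DISPLAY=:0.0
        export XAUTHORITY=/home/me/.Xauthority   # assumed path to the desktop user's X cookie
        cd /home/me
        ./mygui &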

    Read the article

  • Tweaking Hudson memory usage

    - by rovarghe
    Hudson 3.1 has some performance optimizations that greatly reduce its memory footprint. Prior to this, Hudson used to always hold the entire data model (all jobs and all builds) in memory, which affected scalability; some installations configured heap sizes in excess of 1GB to counteract this. Hudson 3.1.x maintains an MRU cache and only loads jobs and builds as they are required. Because of the inability to change existing APIs and stay backward compatible with plugins, there were limits to how far we could go with this approach. Memory optimizations almost always come with a related cost, in this case the additional I/O that has to be performed to load data on request. On a small site with frequent traffic this is usually not noticeable, since the MRU cache will usually hold on to all the data. A large site with infrequent traffic might experience some delays when the first request hits the server after a long gap. If you have a large heap and are able to allocate more memory, the cache settings can be adjusted to take advantage of this and even go back to pre-3.1 behavior. All the cache settings can be passed as options to the JVM container (Tomcat or the default Jetty container) using the -D option. There are two caches, independent of each other, one for jobs and the other for builds.

    For the jobs cache:

        hudson.jobs.cache.evict_in_seconds (default=60)
            Seconds from last access (which could be a servlet request or a background cron thread) after which a job should be purged from the cache. Set this to 0 to never purge based on time.

        hudson.jobs.cache.initial_capacity (default=1024)
            Initial number of jobs the cache can accommodate. Setting this to the number of jobs you typically display on your Hudson landing page or home page will speed up consecutive access to that page. If the default is too large, you may consider downsizing and using that memory for the builds cache instead.

        hudson.jobs.cache.max_entries (default=1024)
            Maximum number of jobs in the cache. The default is large enough for most installations, but if you find I/O activity whenever you access the Hudson home page you might consider increasing this; first verify, though, whether the I/O is caused by frequent eviction (see above) rather than by the cache not being large enough.

    For the builds cache:

    The builds cache is used to store Build objects as they are read from storage. Typically this happens when a user drills down into the details of a particular job from the Hudson home page. The cache is shared among builds for different jobs, since in most installations all jobs are not accessed with the same frequency, so a per-job builds cache would be a waste of memory.

        hudson.job.builds.cache.evict_in_seconds (default=60)
            Same as the equivalent jobs-cache setting, applied to builds.

        hudson.job.builds.cache.initial_capacity (default=512)
            Same as the equivalent jobs-cache setting. Note the smaller initial size. If your site stores a large number of builds and frequently accesses more of them, you might consider bumping this up.

        hudson.job.builds.cache.max_entries (default=10240)
            The default maximum is large enough for most installations. The builds cache holds bigger objects, so be careful about increasing this upper limit; see the section on monitoring below.

    Sample usage:

        java -jar hudson-war-3.1.2-SNAPSHOT.war -Dhudson.jobs.cache.evict_in_seconds=300 \
            -Dhudson.job.builds.cache.evict_in_seconds=300

    Monitoring cache usage

    The 'jmap' tool that comes with the JDK can be used to monitor cache performance in an indirect way, by looking at the number of Job and Build objects in each cache. Find the PID of the Hudson instance and run

        $ jmap -histo:live <pid> | grep 'hudson.model.*Lazy.*Key$'

    Here's a sample output:

         num     #instances         #bytes  class name
         523:            28            896  hudson.model.RunMap$LazyRunValue$Key
        1200:             3             96  hudson.model.LazyTopLevelItem$Key

    These are the keys to the jobs (LazyTopLevelItem$Key) and builds (RunMap$LazyRunValue$Key) in the caches, so counting the number of keys is a good indicator of the number of items in each cache at any given moment. The size in bytes can be ignored: it is just the size of the keys, not the actual size of the objects they hold; those sizes can only be obtained with a profiler. From the output above we can conclude that there are 3 jobs and 28 builds in memory (the 28 builds can all be from 1 job or from all 3 jobs). Over time, on an idle system, these should get evicted and the caches should be empty. In practice, because of background cron threads and triggers, jobs rarely fall to zero: access of a job or a build by a cron thread resets the eviction timer.
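
    Since the post says the same -D options go to whichever JVM container hosts Hudson, here is a minimal sketch of passing them through Tomcat via CATALINA_OPTS (the values and layout are examples, not recommendations):

        export CATALINA_OPTS="-Dhudson.jobs.cache.max_entries=2048 \
          -Dhudson.job.builds.cache.max_entries=20480 \
          -Dhudson.jobs.cache.evict_in_seconds=300"
        "$CATALINA_HOME"/bin/startup.sh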

    Read the article

  • Problem with the screen resolution on Ubuntu 12.04

    - by sveinn
    I just installed Ubuntu on my laptop. The screen resolution is stuck at 1024x768; the screen is made for 1280x800. When I run xrandr I get:

        xrandr: Failed to get size of gamma for output default
        Screen 0: minimum 800 x 600, current 1024 x 768, maximum 1024 x 768
        default connected 1024x768+0+0 0mm x 0mm
           1024x768       61.0*
           800x600        61.0

    1280x800 isn't offered, and I get the gamma-size error. I was going to look into the Xorg.conf file, but I couldn't locate it. 1280x800 was displayed in Windows 7, and I think it is also being displayed in GRUB before Ubuntu starts. Here are some details about my computer:

        CPU: Intel Atom D2500 1.86 GHz
        Chipset: Intel 945GSE + ICH7M
        LCD: 14" TFT, 16:9
        Resolution ratio: 1280*800
        Video card: Intel integrated GMA950

    Does anyone know how to fix this?
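
    The xrandr listing shows a generic "default" output with no driver-provided modes, so the usual stop-gap while the real driver problem is sorted out is to add the missing mode by hand (a sketch; the modeline comes from cvt 1280 800 60, and the output name "default" is taken from the listing above):

        cvt 1280 800 60     # prints the modeline used below
        xrandr --newmode "1280x800_60.00" 83.50 1280 1352 1480 1680 800 803 809 831 -hsync +vsync
        xrandr --addmode default "1280x800_60.00"
        xrandr --output default --mode "1280x800_60.00"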

    Read the article

  • ASP.NET Menu, NavBar And Pager Performance Improvements v2010 vol 1

    Check out the improvements we've made to some of our ASP.NET controls in the DXperience v2010.1 release. We changed the rendering of our ASP.NET AJAX Menu, Navigation Pane and Pager controls. The controls now use semantic rendering combined with advanced CSS styles, which results in a dramatic decrease in HTML output, improved performance and a reduction in the server's workload. Several of our other ASP.NET controls, like the ASPxGridView and ASPxScheduler, also benefit...

    Read the article

  • No Sound via HDMI

    - by Goony Hill
    I have Ubuntu installed on an Acer Aspire Revo 3700 (Intel Atom processor with Nvidia Ion graphics), which is plugged into a Celcus 32-inch TV via HDMI (1080p). The video driver shows as Nvidia, which I can select. I have set the sound to play via HDMI and the output to HDMI, but I get no sound. I have tried a Sony 1080i TV with the box, but I get erratic results with the graphics; the sound, however, picks up straight away, with no need to select it. The graphics on the Celcus TV work, but I get a dialog box showing loads of different resolutions and frequencies which I have to close manually; these appear to be attempts to set different resolutions for the TV. Am I missing some sort of screen/sound driver? If so, does anyone know what might support the Celcus 32-inch (1080p) TV?
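
    A generic way to confirm whether the Ion's HDMI audio device is present and audible, independent of the desktop sound settings (a sketch; the card name and device number vary from machine to machine):

        aplay -l                                        # look for an "NVIDIA ... HDMI" playback device
        speaker-test -D hdmi:CARD=NVidia,DEV=0 -c 2 -t wav
        # If that plays, pick the matching HDMI profile under Sound Preferences -> Hardware
        # (or set it as the default sink in pavucontrol).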

    Read the article
