Search Results

Search found 25093 results on 1004 pages for 'console output'.

  • How do I check whether partitions on my SSD are properly aligned?

    - by elementz
    I just installed Ubuntu on my new Intel SSD. Now I am not sure whether the partitions are properly aligned for this specific SSD. Here's my fdisk output:

        $ fdisk -l

        Disk /dev/sda: 120.0 GB, 120034123776 bytes
        255 heads, 63 sectors/track, 14593 cylinders
        Units = cylinders of 16065 × 512 = 8225280 bytes
        Sector size (logical/physical): 512 bytes / 512 bytes
        I/O size (minimum/optimal): 512 bytes / 512 bytes
        Disk identifier: 0x000a6294

        Device     Boot   Start     End     Blocks   Id  System
        /dev/sda1  *          1    1913   15360000   83  Linux
        /dev/sda2          1913   14058   97558528   83  Linux
        /dev/sda3         14058   14594    4300800   82  Linux Swap / Solaris

    Also, do I still need to align my SSD at all, since I am using TRIM on the ext4 partitions by mounting them with the discard flag? And if it turns out that my partitions are not properly aligned, what could I do to fix this without having to reinstall everything?
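    A quick way to check alignment (a minimal sketch, not specific to this drive): list the partition table in sectors rather than cylinders, and verify that each start sector is a multiple of 2048, i.e. a 1 MiB boundary, which satisfies the common 4 KiB page and 512 KiB erase-block sizes:

        # List partitions in sector units; cylinder units hide misalignment
        sudo fdisk -lu /dev/sda

        # Or read each partition's start sector straight from sysfs
        for p in /sys/block/sda/sda?/start; do
            start=$(cat "$p")
            # remainder 0 means the partition starts on a 1 MiB boundary
            echo "$p: start=$start, start % 2048 = $((start % 2048))"
        done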

  • How to install Radeon X600 driver

    - by mmrs151
    I am completely new to Ubuntu and am now running into many problems with my display. I would be so grateful if the community could help me out and keep me loving Ubuntu. When I run

        lspci | grep VGA

    I get the following output:

        01:00.0 VGA compatible controller: ATI Technologies Inc RV380 [Radeon X600 (PCIE)]

    I am not sure whether this means the driver is installed. The problem is that I cannot change the resolution like I can in Windows. I wanted to use dual monitors, but it cannot detect a second monitor; the monitor option in the preferences shows 'Unknown'. I tried to use a VGA switch for my PS3 and computer, but the computer is stuck at 1024x768 resolution. I have been trying for the last three days and couldn't figure anything out. Now I'm asking for your wisdom to get my display working. Regards, MMRAHMAN
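    Two generic checks that narrow this down (a sketch, nothing machine-specific assumed): see which kernel driver is actually bound to the card, and which outputs and modes X currently detects:

        # Show the kernel driver in use for the VGA device
        lspci -k | grep -A 3 VGA

        # List every output X knows about, with its detected modes
        xrandr --query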

  • How do I make maximum, minimum, and average score statistics in this code? [on hold]

    - by goldensun player
    I want to output the maximum, minimum, and average score as a statistics section under the student IDs and grades in the output file. How do I do that? Here is the code:

        #include "stdafx.h"
        #include <iostream>
        #include <string>
        #include <fstream>
        #include <assert.h>
        using namespace std;

        int openfiles(ifstream& infile, ofstream& outfile);
        void Size(ofstream&, int, string);

        int main()
        {
            int num_student = 4, count, length, score2, w[6];
            ifstream infile, curvingfile;
            char x;
            ofstream outfile;
            float score;
            string key, answer, id;
            do {
                openfiles(infile, outfile);   // function calling
                infile >> key;                // answer key
                for (int i = 0; i < num_student; i++)   // loop over each student
                {
                    infile >> id;
                    infile >> answer;
                    count = 0;
                    length = key.size();   // number of questions in the exam (from exam1.dat); size() is a string function
                    Size(outfile, length, answer);
                    for (int j = 0; j < length; j++)    // loop over each question
                    {
                        if (key[j] == answer[j])
                            count++;
                    }
                    score = (float) count / length;
                    score2 = (int)(score * 100);
                    outfile << id << " " << score2 << "%";
                    if (score2 >= 90)         // <-----w[0]
                        outfile << "A" << endl;
                    else if (score2 >= 80)    // <-----w[1]
                        outfile << "B" << endl;
                    else if (score2 >= 70)    // <-----w[2]
                        outfile << "C" << endl;
                    else if (score2 >= 60)    // <-----w[3]
                        outfile << "D" << endl;
                    else if (score2 >= 50)    // <-----w[4]
                        outfile << "E" << endl;
                    else if (score2 < 50)     // <-----w[5]
                        outfile << "F" << endl;
                }
                cout << "Would you like to attempt a new trial? (y/n): ";
                cin >> x;
            } while (x == 'y' || x == 'Y');
            return 0;
        }

        int openfiles(ifstream& infile, ofstream& outfile)
        {
            string name1, name2, name3, answerstring, curvedata;
            cin >> name1; name2; name3;   // note: "name2;" and "name3;" are no-ops, so both are still empty below
            if (name1 == "exit" || name2 == "exit" || name3 == "exit")
                return false;
            cout << "Input the name for the exam file: ";
            cin >> name1;
            infile.open(name1.c_str());
            infile >> answerstring;
            cout << "Input the name for the curving file: ";
            cin >> name2;
            infile.open(name2.c_str());   // note: this reopens infile; the curvingfile stream is never used
            infile >> curvedata;
            cout << "Input the name for the output: ";
            cin >> name3;
            outfile.open(name3.c_str());
            return true;
        }

        void Size(ofstream& outfile, int length, string answer)
        {
            bool check;   // extra answers, missing answers...
            if (answer.size() > length) {
                outfile << "Unnecessary extra answers";
            } else if (answer.size() < length) {
                outfile << "The remaining answers are incorrect";
            } else {
                check = false;
            }
        }

    Also, how do I use assert for precondition and postcondition checks? I don't understand this very well...
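    One way to add the statistics (a minimal sketch with hypothetical variable names, meant to be merged into the student loop above): track the running maximum, minimum, and total while grading, then write the summary after the loop:

        // before the student loop
        int maxScore = 0, minScore = 100, totalScore = 0;   // scores are 0..100

        // inside the loop, right after score2 is computed
        if (score2 > maxScore) maxScore = score2;
        if (score2 < minScore) minScore = score2;
        totalScore += score2;

        // after the loop: the statistics section under the grades
        outfile << "Maximum score: " << maxScore << "%" << endl;
        outfile << "Minimum score: " << minScore << "%" << endl;
        outfile << "Average score: " << (float) totalScore / num_student << "%" << endl;

    As for assert: it aborts the program when a condition you believe must always hold turns out false. A precondition example is assert(length > 0); before dividing by length; a postcondition example is assert(score2 >= 0 && score2 <= 100); after computing the score.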

  • Scalability of multi-threading in game server

    - by Taylor Hill
    What is a reasonable number of threads for a simple 2D MMO in Java? Is it reasonable to have two threads per connection, one for the input stream and one for the output stream? The reason I ask is that I use a blocking method on the input stream, and a workaround seems unnecessarily complex if I were to try to get around it without adding threads. This is mostly for my own edification; I don't expect to have 5 million people playing it ever, or even 5, but I'm wondering what a good scalable solution is, and whether this is reasonable for a small server (<30 connections).
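    For scale beyond a thread-per-stream design, Java's NIO selector lets a single thread service many connections. A rough sketch (the class name and port are placeholders, and per-connection buffering and error handling are omitted):

        import java.io.IOException;
        import java.net.InetSocketAddress;
        import java.nio.channels.*;

        public class GameServer {
            public static void main(String[] args) throws IOException {
                // One selector thread multiplexes all connections instead of
                // dedicating two threads (read + write) to each client.
                Selector selector = Selector.open();
                ServerSocketChannel server = ServerSocketChannel.open();
                server.bind(new InetSocketAddress(5000));
                server.configureBlocking(false);
                server.register(selector, SelectionKey.OP_ACCEPT);
                while (true) {
                    selector.select();   // blocks until some channel is ready
                    for (SelectionKey key : selector.selectedKeys()) {
                        if (key.isAcceptable()) {
                            SocketChannel client = server.accept();
                            client.configureBlocking(false);
                            client.register(selector, SelectionKey.OP_READ);
                        } else if (key.isReadable()) {
                            // non-blocking read; hand the bytes to game logic
                        }
                    }
                    selector.selectedKeys().clear();
                }
            }
        }

    That said, for under 30 connections, two threads per connection (around 60 threads) is perfectly workable on a modern JVM.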

  • Triple monitors on hybrid video system GeForce+Intel

    - by v_mil
    I use a Lenovo IdeaPad Z580A with hybrid video (GeForce + Intel) and Ubuntu 12.10 x64 (Ukrainian). It has an internal display and two outputs: HDMI and VGA. When I connect a third display to VGA, all displays go black. Pressing Alt+Backspace and logging in brings output back on two displays: the internal one and VGA. System Settings -> Displays (or Monitors; I have a Ukrainian interface) shows three displays: two are on, one (DVI) is off. Turning DVI on and pressing Apply causes an error: "Can not set configuration of controller CRTC 65". The BIOS setting is Optimus (two video cards). The driver for the GeForce is Nouveau. With best regards, Viktor.
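    On hybrid setups like this, a third head often only works once the second GPU's outputs are routed through the primary one. A sketch worth trying (the provider names "Intel" and "nouveau" are typical but vary; check the first command's output):

        # See which render/output providers X knows about
        xrandr --listproviders

        # Route the discrete GPU's outputs through the integrated one
        xrandr --setprovideroutputsource nouveau Intel
        xrandr --auto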

  • How to get nicer error-messages in this bash-script?

    - by moata_u
    I'm trying to catch any error when running a command, in order to write a log file / report. I've tried this code:

        function valid () {
            if [ $? -eq 0 ]; then
                echo "$var1" ": status : OK"
            else
                echo "$var1" ": status : ERROR"
            fi
        }

        function save() {
            sed -i "/:@/c connection.url=jdbc:oracle:thin:@$ip:1521:$dataBase" $search
            var1="adding database ip"
            valid $var1

            sed -i "/connection.username/c connection.username=$name" #$search
            var1="adding database SID"
            valid $var1
        }

        save

    The output looks like this:

        adding database ip : status : OK
        sed: no input file

    But I want it to look like this:

        adding database ip : status : OK
        sed: no input file : status : ERROR

    or this:

        adding database ip : status : OK
        adding database SID : status : ERROR

    I've been trying, but it's not working for me. :(
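    A pattern that makes this reliable (a sketch in the spirit of the code above): capture $? immediately after the command and pass both the status and the label into the reporting function, instead of hoping $? survives until the function reads it:

        function valid() {
            local status=$1; shift
            if [ "$status" -eq 0 ]; then
                echo "$* : status : OK"
            else
                echo "$* : status : ERROR"
            fi
        }

        sed -i "/connection.username/c connection.username=$name" "$search"
        valid $? "adding database SID"

    Note also that in save() the trailing #$search comments out sed's filename argument, which is exactly what produces the "sed: no input file" message; that second sed call never runs against a file at all.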

  • Error when I compile kernel 3.3.2 in ubuntu 12.04

    - by rock-alternativo
    I think this is not a bug in Ubuntu. This is the output:

        OBJCOPY arch/x86/boot/vmlinux.bin
        HOSTCC  arch/x86/boot/tools/build
        BUILD   arch/x86/boot/bzImage
        Setup is 16800 bytes (padded to 16896 bytes).
        System is 4599 kB
        CRC f77d64c0
        Kernel: arch/x86/boot/bzImage is ready (#1)
        Building modules, stage 2.
        MODPOST 3268 modules
        ERROR: "__modver_version_show" [drivers/staging/rts5139/rts5139.ko] undefined!
        WARNING: modpost: Found 4 section mismatch(es).
        To see full details build your kernel with:
        'make CONFIG_DEBUG_SECTION_MISMATCH=y'
        make[1]: *** [__modpost] Error 1
        make: *** [modules] Error 2
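    Since the failure is confined to a single staging module, one workaround (a sketch; the config symbol is inferred from the error path drivers/staging/rts5139) is to disable that driver and rebuild:

        # Turn off the Realtek rts5139 staging driver in .config
        scripts/config --disable RTS5139
        make oldconfig
        make && sudo make modules_install install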

  • Why can't I access hw:1,0 until gstreamer-properties is run once?

    - by Shadd
    Not necessarily an Ubuntu-specific question, but I wasn't sure where else to ask. I have an AverMedia DVD EZMaker 7 which plugs into USB and works well on Ubuntu 12.04. I downloaded and installed the official drivers without error. However, when I try:

        gst-launch alsasrc device="hw:1,0" ! alsasink device="hw:0,0"

    it tells me:

        Setting pipeline to PAUSED ...
        Pipeline is live and does not need PREROLL ...
        Setting pipeline to PLAYING ...
        New clock: GstAudioSrcClock

    which is the normal output, however there is no audible sound. Trying the command again doesn't help. If I run gstreamer-properties and close it right away (I don't need to touch any controls), THEN the gst-launch command works. If I unplug the device and plug it back in, or restart the computer, I have to run gstreamer-properties again. What is gstreamer-properties doing that enables the audio?
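    One way to find out what it is doing (a generic sketch): snapshot the card's ALSA mixer state before and after, and diff them; gstreamer-properties often unmutes a control or switches a capture source as a side effect of probing devices:

        amixer -c 1 contents > before.txt
        gstreamer-properties          # open it, close it right away
        amixer -c 1 contents > after.txt
        diff before.txt after.txt     # whatever changed is the control to set at boot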

  • Help! Installer crashing

    - by Mike
    I had Ubuntu 12.04 installed on a Satellite L635. I wanted to start over with a fresh install, because I had experimented with a bunch of different things and the computer was getting glitchy, with random freezes. I used the same ISO I had used for the original install, and it keeps saying the installer crashed with an errno 5 input/output error. I tried an older 11.10 ISO I had used in the past, with the same issue... To see whether it was the disc or the computer (not a great idea), I tried the 12.04 ISO on another laptop. Now I have two laptops with no OS. Both discs, however, live-boot fine on both systems. PLEASE HELP! I've searched the forums and can't find the same problem.
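    Errno 5 during install usually points at bad media or a corrupt image rather than the machine. A quick check (a sketch; the ISO filename is whatever you downloaded):

        # Compare against the checksums published on ubuntu.com
        md5sum ubuntu-12.04-desktop-i386.iso

    The install disc's boot menu also has a "Check disc for defects" entry worth running on both machines before anything else.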

  • Ubuntu 13.10 change first weekday to Monday in calendar applet

    - by wonderingapple
    Before the update (I was using 13.04), editing

        sudo gedit /etc/default/locale

    so that LC_TIME="en_GB.UTF-8" does the job. However, in 13.10 this does not work anymore. I've tried editing

        sudo gedit /usr/share/i18n/locales/en_AU
        sudo gedit /usr/share/i18n/locales/en_GB
        sudo gedit /usr/share/i18n/locales/en_US

    so that first_weekday 2 in each of the files, but this also does not work. As a reference, when I run locale, the output is

        LANG=en_AU.UTF-8
        LANGUAGE=en_AU:en
        LC_CTYPE="en_AU.UTF-8"
        LC_NUMERIC=en_AU.UTF-8
        LC_TIME=en_AU.UTF-8
        LC_COLLATE="en_AU.UTF-8"
        LC_MONETARY=en_AU.UTF-8
        LC_MESSAGES="en_AU.UTF-8"
        LC_PAPER=en_AU.UTF-8
        LC_NAME=en_AU.UTF-8
        LC_ADDRESS=en_AU.UTF-8
        LC_TELEPHONE=en_AU.UTF-8
        LC_MEASUREMENT=en_AU.UTF-8
        LC_IDENTIFICATION=en_AU.UTF-8
        LC_ALL=

    Please help.
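    One step that is easy to miss (a hedged suggestion, not verified on 13.10 specifically): the files under /usr/share/i18n/locales are sources, and edits to them only take effect after the compiled locales are regenerated and the session restarted:

        # Rebuild the compiled locale archive after editing the sources
        sudo locale-gen
        # or, on Debian/Ubuntu:
        sudo dpkg-reconfigure locales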

  • W520 External monitor setup with Ubuntu 12.10

    - by user108372
    I just installed a fresh Ubuntu 12.10 64-bit Desktop on my Lenovo W520. It looks like there are a lot of challenges around making it work with the out-of-the-box Nouveau drivers, the proprietary Nvidia drivers, or the Intel GPU. I looked at a couple of notes on how to make it work with Bumblebee with Optimus Nvidia; none of them seems to work for 12.10. Does anybody have a solid answer on this? It seems like a lot of people are suffering from this. Here is my xrandr output. Let me know if you need any additional information.

        Screen 0: minimum 320 x 200, current 1920 x 1080, maximum 8192 x 8192
        LVDS1 connected 1920x1080+0+0 (normal left inverted right x axis y axis) 344mm x 193mm
           1920x1080      60.0*+   59.9     50.0
           1680x1050      60.0     59.9
           1600x1024      60.2
           1400x1050      60.0
           1280x1024      60.0
           1440x900       59.9
           1280x960       60.0
           1360x768       59.8     60.0
           1152x864       60.0
           1024x768       60.0
           800x600        60.3     56.2
           640x480        59.9
        VGA1 disconnected (normal left inverted right x axis y axis)

    Thanks, Sef
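    For the record, the usual Bumblebee route on that era of Ubuntu looked like this (a sketch; package availability on 12.10 was exactly what kept changing, so treat it only as a starting point):

        sudo add-apt-repository ppa:bumblebee/stable
        sudo apt-get update
        sudo apt-get install bumblebee bumblebee-nvidia
        optirun glxgears    # quick test that the discrete GPU renders

    Also note that on many Optimus laptops of this generation some external outputs are wired to the discrete NVIDIA chip, so an external monitor may need the BIOS switched from Optimus to discrete-only graphics.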

  • Radeon HD5570 HDMI Video Card 5.1 Audio doesn't work

    - by ryandlf
    I am using Ubuntu and XBMC on my HTPC, and chose the Radeon HD5570 video card, which has an HDMI output. In the sound preferences there is no surround sound option for the video card, just stereo, and although I can get sound through it in XBMC, my receiver does not show Dolby Digital on movies that are in fact Dolby, so it's definitely not giving me the true sound it should. Does this card not support surround sound through HDMI, or have I somehow missed it? If that is the case, does anyone have a suggestion that has been tested and works? I'd like to know it's going to work before investing in yet another video card.

    UPDATE: I purchased an Nvidia GeForce GTS 450, plugged it in, downloaded the proprietary driver from the system control panel, disabled the onboard audio in the BIOS (not sure if this was necessary, but I did it anyway), and changed the sound settings to use the new video card. Everything works flawlessly. It was a seamless setup.
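    For anyone debugging the same thing, a generic way to probe what an HDMI audio device can actually do (a sketch; the card/device numbers are assumptions, so check aplay's output first):

        # List playback devices; the HDMI output appears as its own card/device
        aplay -l

        # Try a 6-channel test tone straight at the HDMI device
        speaker-test -D hw:1,3 -c 6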

  • xdebug 2.2.1 installed but not working with cgi

    - by ts01
    I've installed xdebug via pecl. It is installed (as the phpinfo() output indicates), but it doesn't seem to work with CGI (with CLI it works). I've restarted Apache, without result. Any ideas? Some config details (as parsed by http://xdebug.org/wizard.php):

        Xdebug installed: 2.2.1
        Server API: Apache 2.0 Handler
        Windows: no
        Zend Server: no
        PHP Version: 5.3.10-1
        Zend API nr: 220090626
        PHP API nr: 20090626
        Debug Build: no
        Thread Safe Build: no
        Configuration File Path: /etc/php5/apache2
        Configuration File: /etc/php5/apache2/php.ini
        Extensions directory: /usr/lib/php5/20090626+lfs
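    One common cause worth ruling out (the ini path comes from the details above; the extension path is an assumption built from the listed extensions directory): the CLI and Apache SAPIs read different php.ini files, so xdebug may only be loaded in the CLI one. Check that /etc/php5/apache2/php.ini (or a conf.d file it includes) contains a line like:

        zend_extension=/usr/lib/php5/20090626+lfs/xdebug.so

    and then restart Apache again.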

  • Why is Web SQL database deprecated?

    - by user221287
    I am making a hybrid Android app. At first I decided to use localStorage; after spending two days, I realized that it is very strange, and so I dropped it. Then I picked up IndexedDB; after spending today's whole day and actually getting the output in Google Chrome, it turned out it does not run inside a WebView of the Android app. And I never used the Web SQL database at all, because it was deprecated. Anyhow, it has come to my notice that PhoneGap still uses Web SQL and Android's browsers support it. Why was Web SQL deprecated in the first place? And would it be a good idea for me to go with Web SQL now?

  • disable intel gpu in ubuntu 12.04

    - by small_potato
    I am wondering if there is anything to disable the Intel GPU on Ubuntu 12.04. I want to be able to set up dual monitors using nvidia-settings. It seems the Intel GPU is used for display; as suggested by

        sudo lshw -c display

    the output is

        *-display
             description: VGA compatible controller
             product: NVIDIA Corporation
             vendor: NVIDIA Corporation
             physical id: 0
             bus info: pci@0000:01:00.0
             version: a1
             width: 64 bits
             clock: 33MHz
             capabilities: pm msi pciexpress vga_controller bus_master cap_list rom
             configuration: driver=nvidia latency=0
             resources: irq:16 memory:c0000000-c0ffffff memory:90000000-9fffffff memory:a0000000-a1ffffff ioport:4000(size=128) memory:a2000000-a207ffff
        *-display
             description: VGA compatible controller
             product: Haswell Integrated Graphics Controller
             vendor: Intel Corporation
             physical id: 2
             bus info: pci@0000:00:02.0
             version: 06
             width: 64 bits
             clock: 33MHz
             capabilities: msi pm vga_controller bus_master cap_list rom
             configuration: driver=i915 latency=0
             resources: irq:47 memory:c2000000-c23fffff memory:b0000000-bfffffff ioport:5000(size=64)

    I have a Lenovo Y410 with a GT750M. There seems to be no way to turn off the Intel GPU in the BIOS either. Help please. Thanks.
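    Whether the NVIDIA chip can drive displays directly depends on how the outputs are wired; a first check (generic sketch) is to ask X which GPU actually owns them:

        xrandr --listproviders   # shows each GPU and how many outputs it has

    If all outputs hang off the Intel provider, the Intel GPU cannot simply be disabled: on muxless Optimus machines the usual options in this era were Bumblebee (render offload) or, later, nvidia-prime, with the monitor layout still configured through the Intel-driven Displays panel rather than nvidia-settings.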

  • Switching to an external display, when primary is broken

    - by Shazzner
    I've successfully installed Ubuntu 11.10 Desktop (x86) on an old(er) laptop that unfortunately has a broken screen, so there is an external monitor plugged in. On the live CD it came up on the secondary display just fine, and I was able to install Ubuntu and everything. Unfortunately, when I now reboot into Ubuntu proper, the secondary display is off and I'm literally driving blind here trying to switch to it. I am using the Nvidia open source (Nouveau) drivers. Things I've tried:

      - Rebooting back into the live CD, mounting the partition, and trying in vain to find a config file (it uses the open source drivers, so there is no xorg.conf I could edit manually)
      - Trying to blind-type xrandr settings into what I hope is a terminal: xrandr --output VGA1 --auto (nothing happened)
      - Trying to blind-install openssh-server so I could ssh into the machine and maybe configure it from my working computer. For some reason, though, no luck.

    Ubuntu really should default to expanding to all screens for this use case.
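    A more forgiving variant of the blind-typing approach (a sketch; the output names LVDS1/VGA1 are typical Nouveau names, not confirmed for this machine): switch to a text console with Ctrl+Alt+F1 and log in there, where a typo just produces an error instead of vanishing, then point xrandr at the already-running graphical session:

        export DISPLAY=:0
        export XAUTHORITY=~/.Xauthority   # needed if X refuses the connection
        xrandr --output VGA1 --auto --primary
        xrandr --output LVDS1 --off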

  • Upgrade tree to 1.6?

    - by Pureferret
    I'm trying to upgrade my version of tree to 1.6 on Ubuntu 12.04. I've downloaded it and run make and make install in the terminal using sudo:

        ~/tree-1.6.0$ sudo make
        make: Nothing to be done for `all'.

    (I had already run sudo make here.)

        ~/tree-1.6.0$ sudo make install
        install -d /usr/bin
        install -d /usr/man/man1
        if [ -e tree ]; then \
            install -s tree /usr/bin/tree; \
        fi
        install doc/tree.1 /usr/man/man1/tree.1

    What's this output, though? It's not updated: I've checked the man page, and -du doesn't work. How am I supposed to update tree if not via the terminal?
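    A quick way to see which copy is actually running (generic sketch): compare what the shell resolves against what was just installed, since an older distro package can shadow the new binary:

        type -a tree      # every tree on the PATH, in order
        tree --version    # the version the shell actually runs
        ls -l /usr/bin/tree /usr/local/bin/tree 2>/dev/null

    If the packaged 1.5.x is still installed, removing it first (sudo apt-get remove tree) keeps the two copies from fighting over the same name.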

  • The Orchard Project Planting a Few Seeds

    Orchard is a free, open source, community-focused project aimed at delivering applications and reusable components on the ASP.NET platform. The broad vision of the Orchard project is to grow the ASP.NET open source community and partner with existing application authors to help them achieve their goals. The intended output of the Orchard project is three-fold:

      - Individual .NET-based applications that appeal to end-users, scripters, and developers
      - A set of re-usable components that...

  • How to implement custom texture formats in Android?

    - by random1337
    What I know: Android can load PNG, BMP, WEBP, etc. via BitmapFactory.

    What I want to achieve: load my own 2D file format (e.g. a 1-bit texture with a 1-bit alpha channel) and output an RGBA8888 texture.

    Question: Is there any interface to achieve this (or any other way)? The resulting image is used as a texture for a 3D model.

    Why would you do that? Saving phone memory and download bandwidth, while expanding the texture into RAM at runtime, seems reasonable for very simple textures.
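    There is no BitmapFactory hook for registering new formats, but nothing stops you from decoding the bytes yourself and handing the pixels to Bitmap. A sketch (the 1-bit unpacking is hypothetical and ignores the separate alpha plane for brevity):

        import android.graphics.Bitmap;

        // Unpack a 1-bit-per-pixel buffer into an ARGB_8888 Bitmap,
        // which GLES can upload as an RGBA8888 texture.
        static Bitmap decodeOneBit(byte[] data, int width, int height) {
            int[] pixels = new int[width * height];
            for (int i = 0; i < pixels.length; i++) {
                boolean set = ((data[i / 8] >> (7 - (i % 8))) & 1) != 0;
                pixels[i] = set ? 0xFFFFFFFF : 0x00000000;  // opaque white / transparent
            }
            return Bitmap.createBitmap(pixels, width, height, Bitmap.Config.ARGB_8888);
        }

        // Upload afterwards, e.g.:
        // GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bmp, 0);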

  • How much data validation is too much? [closed]

    - by adbertram
    Possible Duplicate: Data input validation - Where? How much?

    I'm a new PHP developer and am into PowerShell quite a bit, but this question is language-agnostic. I've been questioning my code quite a bit lately, thinking about how many nets I should set up to catch exceptions, verify results, etc. I realize that I could go crazy trying to verify each and every line of code, but at the same time I want the code to be as resilient as possible. I'm not talking about user input, but about verifying output from methods. Is there some standard or rule of thumb for deciding when and where to do data validation?

  • Screen Resolution Problem with Ubuntu 14.04 and VirtualBox

    - by user3341257
    Environment: Lenovo T530 running Windows 7. I have installed Ubuntu 14.04 on a virtual machine using VirtualBox, and have installed all the updates from both VirtualBox and Ubuntu.

    Problem: On Ubuntu's desktop and other Ubuntu-initiated programs, the window is reduced to about 3x4 inches in the middle of the rest of my regular VirtualBox window, and I am seeing only the upper right-hand corner of the screen output I would normally see. Please help. I've seen "How do I install Guest Additions in a VirtualBox VM?", but none of those answers works in 14.04.
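    For reference, the manual Guest Additions install that usually fixes a stuck resolution (a sketch; menu wording varies slightly by VirtualBox version): choose Devices -> Insert Guest Additions CD image with the guest running, then inside the guest:

        sudo apt-get install build-essential dkms linux-headers-$(uname -r)
        sudo mount /dev/cdrom /mnt
        sudo /mnt/VBoxLinuxAdditions.run
        # reboot the guest; the display should then track the window size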

  • Bad sound quality and headphones not working

    - by wifi
    Using Ubuntu 10.10 on an HP Pavilion t3019.es, which has a Realtek ALC880 sound card. It has 6 rear jack outputs, plus digital audio input and output, plus 3 front jacks (mic, headphones, and a blue one whose purpose I don't know). The sound on my computer is very low, and when I raise the volume up to 50% it starts sounding distorted and crackling. Also, the headphones don't work when I plug them in (it just keeps playing through the speakers). I tried editing the /etc/modprobe.d/alsa-base.conf file according to the sound card and jacks in my computer, but none of the lines I added worked (naturally, I didn't add them all at once). I found out that adding

        options snd-hda-intel model=generic

    to it made the sound better, but it's not as good as in Windows yet. Any ideas? Other than setting the PCM value, nothing worked for me. Thanks.
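    A hedged suggestion: the ALC880 has many board-specific quirk models, and trying the closest match to the hardware (e.g. a 6stack variant for a 6-jack rear panel) sometimes fixes both the routing and the headphone jack sensing. The list for your kernel ships with the ALSA docs, if present at the usual path:

        # See which models the snd-hda-intel driver knows about
        zless /usr/share/doc/alsa-base/driver/HD-Audio-Models.txt.gz

        # then try e.g. in /etc/modprobe.d/alsa-base.conf:
        # options snd-hda-intel model=6stack-digout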

  • Tweaking Hudson memory usage

    - by rovarghe
    Hudson 3.1 has some performance optimizations that greatly reduce its memory footprint. Prior to this, Hudson always held the entire data model (all jobs and all builds) in memory, which affected scalability; some installations configured heap sizes in excess of 1GB to counteract this. Hudson 3.1.x maintains an MRU cache and only loads jobs and builds as they are required. Because of the inability to change existing APIs and remain backward compatible with plugins, there were limits to how far we could go with this approach. Memory optimizations almost always come with a related cost; in this case it is the additional I/O that has to be performed to load data on request. On a small site with frequent traffic this is usually not noticeable, since the MRU cache will usually hold on to all the data. A large site with infrequent traffic might experience some delays when the first request hits the server after a long gap. If you have a large heap and are able to allocate more memory, the cache settings can be adjusted to take advantage of this and even go back to pre-3.1 behavior.

    All the cache settings can be passed as options to the JVM container (Tomcat or the default Jetty container) using the -D option. There are two caches, independent of each other: one for jobs and the other for builds.

    For the jobs cache:

      - hudson.jobs.cache.evict_in_seconds (default=60): Seconds from last access (which could be a servlet request or a background cron thread) before a job is purged from the cache. Set this to 0 to never purge based on time.
      - hudson.jobs.cache.initial_capacity (default=1024): Initial number of jobs the cache can accommodate. Setting this to the number of jobs you typically display on your Hudson landing page or home page will speed up consecutive access to that page. If the default is too large, you may consider downsizing and using that memory for the builds cache instead.
      - hudson.jobs.cache.max_entries (default=1024): Maximum number of jobs in the cache. The default is large enough for most installations, but if you see I/O activity whenever you access the Hudson home page, you might consider increasing this; first verify whether the I/O is caused by frequent eviction (see above) rather than by the cache not being large enough.

    For the builds cache: the builds cache is used to store Build objects as they are read from storage. Typically this happens when a user drills down into the details of a particular job from the Hudson home page. The cache is shared among builds for different jobs, since in most installations all jobs are not accessed with the same frequency, so a per-job builds cache would be a waste of memory.

      - hudson.job.builds.cache.evict_in_seconds (default=60): Same as the equivalent jobs cache setting, applied to builds.
      - hudson.job.builds.cache.initial_capacity (default=512): Same as the equivalent jobs cache setting. Note the smaller initial size. If your site stores a large number of builds and frequently accesses more of them, you might consider bumping this up.
      - hudson.job.builds.cache.max_entries (default=10240): The default max is large enough for most installations. The builds cache holds larger objects, so be careful about increasing this upper limit; see the section on monitoring below.
    Sample usage:

        java -jar hudson-war-3.1.2-SNAPSHOT.war \
            -Dhudson.jobs.cache.evict_in_seconds=300 \
            -Dhudson.job.builds.cache.evict_in_seconds=300

    Monitoring cache usage

    The 'jmap' tool that comes with the JDK can be used to monitor cache performance in an indirect way, by looking at the number of Job and Build objects in each cache. Find the PID of the Hudson instance and run:

        $ jmap -histo:live <pid> | grep 'hudson.model.*Lazy.*Key$'

    Here's a sample output:

        num     #instances    #bytes  class name
        523:            28       896  hudson.model.RunMap$LazyRunValue$Key
        1200:            3        96  hudson.model.LazyTopLevelItem$Key

    These are the keys to the jobs (LazyTopLevelItem$Key) and builds (RunMap$LazyRunValue$Key) in the caches, so counting the keys is a good indicator of the number of items in the cache at any given moment. The size in bytes can be ignored: it is just the size of the keys, not the actual size of the objects they hold, which can only be obtained with a profiler. With the output above we can conclude that there are 3 jobs and 28 builds in memory (the 28 builds could all be from 1 job or spread across all 3). Over time on an idle system these should get evicted and the memory cache should be empty. In practice, because of background cron threads and triggers, jobs rarely fall down to zero; access to a job or a build by a cron thread resets its eviction timer.
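    For completeness, when Hudson runs inside Tomcat rather than the embedded Jetty, the same properties go into CATALINA_OPTS (a hedged example; the setenv.sh location is the usual Tomcat convention, and the values are illustrative):

        # $CATALINA_HOME/bin/setenv.sh: give the caches a longer eviction
        # window and more headroom on a large-heap install
        export CATALINA_OPTS="$CATALINA_OPTS \
            -Dhudson.jobs.cache.evict_in_seconds=300 \
            -Dhudson.jobs.cache.max_entries=2048 \
            -Dhudson.job.builds.cache.max_entries=20480"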

  • Do unused vertices in a 3D object affect performance?

    - by Gajet
    For my game I need to generate a mesh dynamically. Now I'm wondering: does it have a noticeable effect on FPS if I allocate more vertices than I'm actually using? And does it matter whether I'm using DirectX or OpenGL?

    Edit: The final output will be a w*h cell grid, but for technical reasons it's much easier for me to allocate (w+1)*(h+1) vertices. Sure, I'll only use w*h vertices in indexing, and I know there is some memory waste there, but I want to know if it also affects FPS or not. (Note that the mesh is only generated once, each time you play the game.)

  • Problem with the screen resolution on Ubuntu 12.04

    - by sveinn
    I just installed Ubuntu on my laptop. The screen resolution is stuck at 1024x768, but the screen is made for 1280x800. When I run xrandr I get:

        xrandr: Failed to get size of gamma for output default
        Screen 0: minimum 800 x 600, current 1024 x 768, maximum 1024 x 768
        default connected 1024x768+0+0 0mm x 0mm
           1024x768       61.0*
           800x600        61.0

    1280x800 isn't offered, and I get the gamma size error. I was going to look into the Xorg.conf file, but I couldn't locate it. 1280x800 was displayed in Windows 7, and I think it is also being displayed in GRUB before Ubuntu starts. Here are some details about my computer:

        CPU: Intel Atom D2500 1.86GHz
        Chipset: Intel 945GSE + ICH7M
        LCD: 14" TFT 16:9
        Resolution ratio: 1280x800
        Video Card: Intel integrated GMA950

    Does anyone know how to fix this?
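    When a known-good mode is simply missing, it can often be added by hand and tested (a generic sketch; the modeline numbers come straight from cvt for 1280x800@60 and are not tuned to this panel):

        # Generate a modeline and attach it to the output
        cvt 1280 800 60
        xrandr --newmode "1280x800_60.00" 83.50 1280 1352 1480 1680 800 803 809 831 -hsync +vsync
        xrandr --addmode default 1280x800_60.00
        xrandr --output default --mode 1280x800_60.00

    If xrandr refuses (the driverless 'default' output often caps at its current maximum, which the gamma error hints at), the real fix is usually getting a proper kernel modesetting driver loaded for the chipset.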
