Search Results

Search found 415 results on 17 pages for 'chip sprague'.

Page 11/17 | < Previous Page | 7 8 9 10 11 12 13 14 15 16 17  | Next Page >

  • 12.10 visual performance using nvidia driver

    - by user100485
    My fresh Ubuntu 12.10 install is slow. Nothing extreme, but dragging windows, switching workspaces and similar actions are sluggish and look horrible; it feels like the frame rate is dropping in a game. Doing some Photoshop work in Windows was a relief by comparison, and the effect gets worse if I connect my external monitor. My system is an Intel Pentium Dual-Core T4500 with 4GB of memory and a GeForce 8200M G integrated graphics chip (an MSI CR500 laptop). Nothing fancy, but it should run fine, and my "experience" in Ubuntu is set to standard. I've installed the NVIDIA drivers and tried both the current and experimental versions; the experimental driver seems to perform a bit better, but it is still bad overall. I set the mode to adaptive in the nvidia-settings tool, but it jumps straight to the maximum setting and never comes back down. Using htop I found that compiz and the X server always use a few percent of my CPU, more than I think they should; the time consumed is 5:18 for compiz, 4:33 for /usr/bin/X and 2:41 for Google Chrome (with about 30 tabs open, so that one is not too strange). What can I do to improve the visual performance? This makes me not want to use Ubuntu in public!
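
    A quick way to confirm whether the NVIDIA driver (rather than software rendering or a fallback driver) is actually in use is sketched below; it assumes the mesa-utils package for glxinfo, and the exact wording of the PowerMizer attributes varies between driver releases:

      # which kernel module is bound to the GPU, and which GL renderer is active
      lspci -nnk | grep -A3 -i vga
      glxinfo | grep -i "opengl renderer"
      lsmod | grep -i nvidia
      # inspect the PowerMizer state the driver reports
      nvidia-settings -q all | grep -i powermizer

    If the renderer line mentions llvmpipe or a software rasterizer instead of the GeForce, the slowness is coming from software rendering rather than from the driver's clocking behaviour.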

    Read the article

  • Hardware compatibility on H97 chipset/hardware support

    - by user3238850
    I am aware that there is documentation about compatibility, but it is badly outdated. I am also aware of the hardware compatibility page on the Ubuntu website, but that one is focused on whole machines rather than individual pieces of hardware. I have some experience with Linux and some experience running Ubuntu Server in a virtual machine, but I have never worked on a machine that lives on the real internet. I am building a home server around an Intel H97 chipset motherboard. I have looked at several models and none of them lists Linux in the supported OS category. I installed Ubuntu Desktop 14.04 on my four-year-old laptop and, apart from some system errors on start-up, there is not much I can complain about, so I guess I should be fine. However, this time I am going to install Ubuntu Server 14.04 on a relatively new piece of hardware (I went to http://linux-drivers.org/ but found nothing really helpful). For example, the ASUS motherboard has an M.2 socket and an Intel LAN I218V chip, and the Gigabyte motherboard has two LAN chips (Intel LAN WGI217V and Atheros AR8161-BL3A-R). So I really want to make sure everything will work. Usually I would just trust Ubuntu and buy all the hardware I need, but based on my past experience with the Desktop version on my laptop I am not so convinced. There is one easily noticeable difference: when the system is idle, the fan runs much more often and for longer under Ubuntu. This makes me suspect that hardware support will generally be worse under Ubuntu, which is not surprising at all, but it is enough to make me post here. And as far as I know, some Intel CPU features come with software that usually will not run under Linux. Any help, ideas or thoughts would be greatly appreciated!
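
    One way to check, before or after buying, whether a particular LAN chip is claimed by an in-kernel driver is to match its PCI vendor:device ID against the module alias table of the kernel you plan to run; a rough sketch, where the ID used in the grep is only an example:

      # show each network controller, its PCI ID and the driver bound to it
      lspci -nnk | grep -A3 -i 'ethernet\|network'
      # check whether any module of the running kernel claims a given ID
      # (the 8086:15a1 below is a placeholder; substitute the ID lspci prints)
      grep -i '8086.*15a1' /lib/modules/$(uname -r)/modules.alias
      # the Intel I217/I218 parts are normally handled by e1000e
      modinfo e1000e | head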

    Read the article

  • wifi not working hp dv4

    - by Blaze
    I get this output for sudo lshw -C network: *-network description: Ethernet interface product: RTL8111/8168B PCI Express Gigabit Ethernet controller vendor: Realtek Semiconductor Co., Ltd. physical id: 0 bus info: pci@0000:08:00.0 logical name: eth0 version: 06 serial: 3c:4a:92:cd:63:98 size: 10Mbit/s capacity: 1Gbit/s width: 64 bits clock: 33MHz capabilities: pm msi pciexpress msix vpd bus_master cap_list ethernet physical tp mii 10bt 10bt-fd 100bt 100bt-fd 1000bt 1000bt-fd autonegotiation configuration: autonegotiation=on broadcast=yes driver=r8169 driverversion=2.3LK-NAPI duplex=half firmware=N/A latency=0 link=no multicast=yes port=MII speed=10Mbit/s resources: irq:42 ioport:3000(size=256) memory:c0404000-c0404fff memory:c0400000-c0403fff *-network DISABLED description: Wireless interface product: RT5390 [802.11 b/g/n 1T1R G-band PCI Express Single Chip] vendor: Ralink corp. physical id: 0 bus info: pci@0000:0a:00.0 logical name: wlan0 version: 00 serial: 90:00:4e:82:3c:5b width: 32 bits clock: 33MHz capabilities: pm msi pciexpress bus_master cap_list ethernet physical wireless configuration: broadcast=yes driver=rt2800pci driverversion=3.0.0-17-generic-pae firmware=0.34 latency=0 link=no multicast=yes wireless=IEEE 802.11bgn resources: irq:17 memory:c2500000-c250ffff I can't enable the wifi using the Fn+F12 key; it's just stuck. In Windows 7 the same keys do enable it, but only after the system has finished booting. I just want to get the wifi working in any way possible. Can anybody help?
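
    When lshw marks the wireless interface as DISABLED and the Fn hotkey does nothing, the block is often just an rfkill state; a short sketch of what is usually worth trying (the module name comes from the lshw output above):

      # show the soft/hard kill-switch state of every radio
      rfkill list
      # clear any software block
      sudo rfkill unblock all
      # reload the Ralink driver and watch the kernel log while pressing Fn+F12
      sudo modprobe -r rt2800pci && sudo modprobe rt2800pci
      dmesg | tail -n 20

    If rfkill still reports "Hard blocked: yes" after this, the switch state is coming from the firmware/BIOS rather than from Ubuntu.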

    Read the article

  • My Android phone isn't being detected by Ubuntu

    - by Lara
    This is what I've got from terminal, looks like it can see the phone as a USB device just fine but isn't showing up under fdisk so I can't mount it. It automounts just fine in my VMWare Windows. And Internet tethering works fine while under Linux (haven't tried under Windows). Here's lsusb: Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub Bus 002 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub Bus 003 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub Bus 004 Device 001: ID 1d6b:0003 Linux Foundation 3.0 root hub Bus 001 Device 002: ID 8087:0024 Intel Corp. Integrated Rate Matching Hub Bus 002 Device 002: ID 8087:0024 Intel Corp. Integrated Rate Matching Hub Bus 002 Device 019: ID 04d9:1135 Holtek Semiconductor, Inc. Bus 001 Device 003: ID 05ca:18c0 Ricoh Co., Ltd Bus 001 Device 004: ID 0489:e00f Foxconn / Hon Hai Foxconn T77H114 BCM2070 [Single-Chip Bluetooth 2.1 + EDR Adapter] Bus 003 Device 010: ID 04e8:6860 Samsung Electronics Co., Ltd Bus 002 Device 012: ID 046d:c315 Logitech, Inc. Classic New Touch Keyboard And here's sudo fdisk -l: Disk /dev/sda: 500.1 GB, 500107862016 bytes 255 heads, 63 sectors/track, 60801 cylinders, total 976773168 sectors Units = sectors of 1 * 512 = 512 bytes Sector size (logical/physical): 512 bytes / 512 bytes I/O size (minimum/optimal): 512 bytes / 512 bytes Disk identifier: 0x0001ff06 Device Boot Start End Blocks Id System /dev/sda1 2048 681845797 340921875 7 HPFS/NTFS/exFAT /dev/sda2 * 681846784 845686783 81920000 83 Linux /dev/sda3 845686784 968566783 61440000 7 HPFS/NTFS/exFAT /dev/sda4 968568830 972475081 1953126 5 Extended /dev/sda5 968568832 972475081 1953125 82 Linux swap / Solaris Disk /dev/mapper/cryptswap1: 2000 MB, 2000000000 bytes 255 heads, 63 sectors/track, 243 cylinders, total 3906250 sectors Units = sectors of 1 * 512 = 512 bytes Sector size (logical/physical): 512 bytes / 512 bytes I/O size (minimum/optimal): 512 bytes / 512 bytes Disk identifier: 0xbe4c2ec7 Disk /dev/mapper/cryptswap1 doesn't contain a valid partition table
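
    Phones of this generation usually expose their storage over MTP rather than as a USB mass-storage disk, so they will never show up in fdisk; a sketch of mounting over MTP instead (package names as in the Ubuntu archives of that time, and the mount point is arbitrary):

      # MTP devices are not block devices, so fdisk cannot list them
      sudo apt-get install mtp-tools mtpfs
      mtp-detect | head        # should report the Samsung device
      mkdir -p ~/phone
      mtpfs ~/phone            # mount the phone's storage via FUSE
      ls ~/phone
      fusermount -u ~/phone    # unmount when done

    Alternatively, if the phone offers a USB mass-storage mode in its settings, enabling it makes the storage appear as a normal /dev/sd* device.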

    Read the article

  • Does Hauppauge WinTV HVR-900 (r2) [USB ID 2040:6502] work with ubuntu 12.04 LTS?

    - by nightfly
    I have this DVB+Analog usb tv tuner Hauppauge WinTV HVR-900 (r2) [USB ID 2040:6502]. This used to work under ubuntu 10.04 LTS. But in 12.04 there seems to be a problem. I have linux-firmware-nonfree and ivtv-utils installed. I am running Ubuntu 12.04.1 LTS 64 bit with all updates installed and the default unity environment. When I run mplayer tv:// -tv driver=v4l2:device=/dev/video1:input=1:norm=PAL I get a solid green screen and no picture. Here input 1 is the composite input of the card. MPlayer svn r34540 (Ubuntu), built with gcc-4.6 (C) 2000-2012 MPlayer Team mplayer: could not connect to socket mplayer: No such file or directory Failed to open LIRC support. You will not be able to use your remote control. Playing tv://. TV file format detected. Selected driver: v4l2 name: Video 4 Linux 2 input author: Martin Olschewski comment: first try, more to come ;-) Selected device: Hauppauge WinTV HVR 900 (R2) Tuner cap: Tuner rxs: Capabilities: video capture VBI capture device tuner audio read/write streaming supported norms: 0 = NTSC; 1 = NTSC-M; 2 = NTSC-M-JP; 3 = NTSC-M-KR; 4 = NTSC-443; 5 = PAL; 6 = PAL-BG; 7 = PAL-H; 8 = PAL-I; 9 = PAL-DK; 10 = PAL-M; 11 = PAL-N; 12 = PAL-Nc; 13 = PAL-60; 14 = SECAM; 15 = SECAM-B; 16 = SECAM-G; 17 = SECAM-H; 18 = SECAM-DK; 19 = SECAM-L; 20 = SECAM-Lc; inputs: 0 = Television; 1 = Composite1; 2 = S-Video; Current input: 1 Current format: YUYV v4l2: current audio mode is : MONO v4l2: ioctl set format failed: Invalid argument v4l2: ioctl set format failed: Invalid argument v4l2: ioctl set format failed: Invalid argument v4l2: ioctl query control failed: Invalid argument v4l2: ioctl query control failed: Invalid argument v4l2: ioctl query control failed: Invalid argument v4l2: ioctl query control failed: Invalid argument Failed to open VDPAU backend libvdpau_nvidia.so: cannot open shared object file: No such file or directory [vdpau] Error when calling vdp_device_create_x11: 1 ========================================================================== Opening video decoder: [raw] RAW Uncompressed Video Movie-Aspect is undefined - no prescaling applied. VO: [xv] 640x480 = 640x480 Packed YUY2 Selected video codec: [rawyuy2] vfm: raw (RAW YUY2) ========================================================================== Audio: no sound Starting playback... v4l2: select timeout V: 0.0 2/ 2 ??% ??% ??,?% 0 0 v4l2: select timeout V: 0.0 4/ 4 ??% ??% ??,?% 0 0 v4l2: select timeout V: 0.0 6/ 6 ??% ??% ??,?% 0 0 v4l2: select timeout v4l2: 0 frames successfully processed, 1 frames dropped. Exiting... (Quit) Here is the dmesg of the card when plugged in.. 
[12742.228097] usb 1-4: new high-speed USB device number 3 using ehci_hcd [12742.367289] em28xx: New device WinTV HVR-900 @ 480 Mbps (2040:6502, interface 0, class 0) [12742.367296] em28xx: Audio Vendor Class interface 0 found [12742.367585] em28xx #0: chip ID is em2882/em2883 [12742.550086] em28xx #0: i2c eeprom 00: 1a eb 67 95 40 20 02 65 d0 12 5c 03 82 1e 6a 18 [12742.550104] em28xx #0: i2c eeprom 10: 00 00 24 57 66 07 01 00 00 00 00 00 00 00 00 00 [12742.550120] em28xx #0: i2c eeprom 20: 46 00 01 00 f0 10 02 00 b8 00 00 00 5b e0 00 00 [12742.550135] em28xx #0: i2c eeprom 30: 00 00 20 40 20 6e 02 20 10 01 01 01 00 00 00 00 [12742.550150] em28xx #0: i2c eeprom 40: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 [12742.550165] em28xx #0: i2c eeprom 50: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 [12742.550181] em28xx #0: i2c eeprom 60: 00 00 00 00 00 00 00 00 00 00 18 03 34 00 30 00 [12742.550196] em28xx #0: i2c eeprom 70: 32 00 37 00 38 00 32 00 33 00 39 00 30 00 31 00 [12742.550211] em28xx #0: i2c eeprom 80: 00 00 1e 03 57 00 69 00 6e 00 54 00 56 00 20 00 [12742.550226] em28xx #0: i2c eeprom 90: 48 00 56 00 52 00 2d 00 39 00 30 00 30 00 00 00 [12742.550241] em28xx #0: i2c eeprom a0: 84 12 00 00 05 50 1a 7f d4 78 23 fa fd d0 28 89 [12742.550257] em28xx #0: i2c eeprom b0: ff 00 00 00 04 84 0a 00 01 01 20 77 00 40 1d b7 [12742.550272] em28xx #0: i2c eeprom c0: 13 f0 74 02 01 00 01 79 63 00 00 00 00 00 00 00 [12742.550287] em28xx #0: i2c eeprom d0: 84 12 00 00 05 50 1a 7f d4 78 23 fa fd d0 28 89 [12742.550302] em28xx #0: i2c eeprom e0: ff 00 00 00 04 84 0a 00 01 01 20 77 00 40 1d b7 [12742.550317] em28xx #0: i2c eeprom f0: 13 f0 74 02 01 00 01 79 63 00 00 00 00 00 00 00 [12742.550334] em28xx #0: EEPROM ID= 0x9567eb1a, EEPROM hash = 0x2bbf3bdd [12742.550338] em28xx #0: EEPROM info: [12742.550340] em28xx #0: AC97 audio (5 sample rates) [12742.550343] em28xx #0: 500mA max power [12742.550346] em28xx #0: Table at 0x24, strings=0x1e82, 0x186a, 0x0000 [12742.552590] em28xx #0: Identified as Hauppauge WinTV HVR 900 (R2) (card=18) [12742.555516] tveeprom 15-0050: Hauppauge model 65018, rev B2C0, serial# 1292061 [12742.555523] tveeprom 15-0050: tuner model is Xceive XC3028 (idx 120, type 71) [12742.555529] tveeprom 15-0050: TV standards PAL(B/G) PAL(I) PAL(D/D1/K) ATSC/DVB Digital (eeprom 0xd4) [12742.555534] tveeprom 15-0050: audio processor is None (idx 0) [12742.555537] tveeprom 15-0050: has radio [12742.570297] tuner 15-0061: Tuner -1 found with type(s) Radio TV. [12742.570327] xc2028 15-0061: creating new instance [12742.570332] xc2028 15-0061: type set to XCeive xc2028/xc3028 tuner [12742.573685] xc2028 15-0061: Loading 80 firmware images from xc3028-v27.fw, type: xc2028 firmware, ver 2.7 [12742.624056] xc2028 15-0061: Loading firmware for type=BASE MTS (5), id 0000000000000000. [12744.126591] xc2028 15-0061: Loading firmware for type=MTS (4), id 000000000000b700. [12744.153586] xc2028 15-0061: Loading SCODE for type=MTS LCD NOGD MONO IF SCODE HAS_IF_4500 (6002b004), id 000000000000b700. 
[12744.280963] Registered IR keymap rc-hauppauge [12744.281151] input: em28xx IR (em28xx #0) as /devices/pci0000:00/0000:00:1a.7/usb1/1-4/rc/rc1/input10 [12744.281541] rc1: em28xx IR (em28xx #0) as /devices/pci0000:00/0000:00:1a.7/usb1/1-4/rc/rc1 [12744.282454] em28xx #0: Config register raw data: 0xd0 [12744.284709] em28xx #0: AC97 vendor ID = 0xffffffff [12744.285829] em28xx #0: AC97 features = 0x6a90 [12744.285832] em28xx #0: Empia 202 AC97 audio processor detected [12744.359211] em28xx #0: v4l2 driver version 0.1.3 [12744.404066] xc2028 15-0061: Loading firmware for type=BASE F8MHZ MTS (7), id 0000000000000000. [12745.915089] MTS (4), id 00000000000000ff: [12745.915100] xc2028 15-0061: Loading firmware for type=MTS (4), id 0000000100000007. [12746.161668] em28xx #0: V4L2 video device registered as video1 [12746.161673] em28xx #0: V4L2 VBI device registered as vbi0 [12746.162845] em28xx-audio.c: probing for em28xx Audio Vendor Class [12746.162848] em28xx-audio.c: Copyright (C) 2006 Markus Rechberger [12746.162851] em28xx-audio.c: Copyright (C) 2007-2011 Mauro Carvalho Chehab [12746.221099] xc2028 15-0061: attaching existing instance [12746.221105] xc2028 15-0061: type set to XCeive xc2028/xc3028 tuner [12746.221109] em28xx #0: em28xx #0/2: xc3028 attached [12746.221113] DVB: registering new adapter (em28xx #0) [12746.221118] DVB: registering adapter 0 frontend 0 (Micronas DRXD DVB-T)... [12746.221869] em28xx #0: Successfully loaded em28xx-dvb [13111.196055] xc2028 15-0061: Loading firmware for type=BASE F8MHZ MTS (7), id 0000000000000000. [13112.720062] MTS (4), id 00000000000000ff: [13112.720072] xc2028 15-0061: Loading firmware for type=MTS (4), id 0000000100000007. [13214.956057] xc2028 15-0061: Loading firmware for type=BASE F8MHZ MTS (7), id 0000000000000000. [13216.479806] MTS (4), id 00000000000000ff: [13216.479816] xc2028 15-0061: Loading firmware for type=MTS (4), id 0000000100000007. [13276.408056] xc2028 15-0061: Loading firmware for type=BASE F8MHZ MTS (7), id 0000000000000000. [13277.932093] MTS (4), id 00000000000000ff: [13277.932104] xc2028 15-0061: Loading firmware for type=MTS (4), id 0000000100000007. [13305.032076] xc2028 15-0061: Loading firmware for type=BASE F8MHZ MTS (7), id 0000000000000000. [13306.556449] MTS (4), id 00000000000000ff: [13306.556460] xc2028 15-0061: Loading firmware for type=MTS (4), id 0000000100000007. [13392.236055] xc2028 15-0061: Loading firmware for type=BASE F8MHZ MTS (7), id 0000000000000000. [13393.760123] MTS (4), id 00000000000000ff: [13393.760133] xc2028 15-0061: Loading firmware for type=MTS (4), id 0000000100000007. 
[13637.534053] usb 1-4: USB disconnect, device number 3 [13637.534183] em28xx #0: disconnecting em28xx #0 video [13637.560214] em28xx #0: V4L2 device vbi0 deregistered [13637.560335] em28xx #0: V4L2 device video1 deregistered [13637.561237] xc2028 15-0061: destroying instance [13639.772120] usb 1-4: new high-speed USB device number 4 using ehci_hcd [13639.911351] em28xx: New device WinTV HVR-900 @ 480 Mbps (2040:6502, interface 0, class 0) [13639.911357] em28xx: Audio Vendor Class interface 0 found [13639.911637] em28xx #0: chip ID is em2882/em2883 [13640.094262] em28xx #0: i2c eeprom 00: 1a eb 67 95 40 20 02 65 d0 12 5c 03 82 1e 6a 18 [13640.094280] em28xx #0: i2c eeprom 10: 00 00 24 57 66 07 01 00 00 00 00 00 00 00 00 00 [13640.094295] em28xx #0: i2c eeprom 20: 46 00 01 00 f0 10 02 00 b8 00 00 00 5b e0 00 00 [13640.094311] em28xx #0: i2c eeprom 30: 00 00 20 40 20 6e 02 20 10 01 01 01 00 00 00 00 [13640.094326] em28xx #0: i2c eeprom 40: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 [13640.094341] em28xx #0: i2c eeprom 50: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 [13640.094356] em28xx #0: i2c eeprom 60: 00 00 00 00 00 00 00 00 00 00 18 03 34 00 30 00 [13640.094371] em28xx #0: i2c eeprom 70: 32 00 37 00 38 00 32 00 33 00 39 00 30 00 31 00 [13640.094386] em28xx #0: i2c eeprom 80: 00 00 1e 03 57 00 69 00 6e 00 54 00 56 00 20 00 [13640.094401] em28xx #0: i2c eeprom 90: 48 00 56 00 52 00 2d 00 39 00 30 00 30 00 00 00 [13640.094416] em28xx #0: i2c eeprom a0: 84 12 00 00 05 50 1a 7f d4 78 23 fa fd d0 28 89 [13640.094432] em28xx #0: i2c eeprom b0: ff 00 00 00 04 84 0a 00 01 01 20 77 00 40 1d b7 [13640.094447] em28xx #0: i2c eeprom c0: 13 f0 74 02 01 00 01 79 63 00 00 00 00 00 00 00 [13640.094462] em28xx #0: i2c eeprom d0: 84 12 00 00 05 50 1a 7f d4 78 23 fa fd d0 28 89 [13640.094477] em28xx #0: i2c eeprom e0: ff 00 00 00 04 84 0a 00 01 01 20 77 00 40 1d b7 [13640.094492] em28xx #0: i2c eeprom f0: 13 f0 74 02 01 00 01 79 63 00 00 00 00 00 00 00 [13640.094509] em28xx #0: EEPROM ID= 0x9567eb1a, EEPROM hash = 0x2bbf3bdd [13640.094512] em28xx #0: EEPROM info: [13640.094515] em28xx #0: AC97 audio (5 sample rates) [13640.094517] em28xx #0: 500mA max power [13640.094521] em28xx #0: Table at 0x24, strings=0x1e82, 0x186a, 0x0000 [13640.097391] em28xx #0: Identified as Hauppauge WinTV HVR 900 (R2) (card=18) [13640.099617] tveeprom 15-0050: Hauppauge model 65018, rev B2C0, serial# 1292061 [13640.099623] tveeprom 15-0050: tuner model is Xceive XC3028 (idx 120, type 71) [13640.099629] tveeprom 15-0050: TV standards PAL(B/G) PAL(I) PAL(D/D1/K) ATSC/DVB Digital (eeprom 0xd4) [13640.099634] tveeprom 15-0050: audio processor is None (idx 0) [13640.099637] tveeprom 15-0050: has radio [13640.112849] tuner 15-0061: Tuner -1 found with type(s) Radio TV. [13640.112877] xc2028 15-0061: creating new instance [13640.112882] xc2028 15-0061: type set to XCeive xc2028/xc3028 tuner [13640.115930] xc2028 15-0061: Loading 80 firmware images from xc3028-v27.fw, type: xc2028 firmware, ver 2.7 [13640.164057] xc2028 15-0061: Loading firmware for type=BASE MTS (5), id 0000000000000000. [13641.666643] xc2028 15-0061: Loading firmware for type=MTS (4), id 000000000000b700. [13641.693262] xc2028 15-0061: Loading SCODE for type=MTS LCD NOGD MONO IF SCODE HAS_IF_4500 (6002b004), id 000000000000b700. 
[13641.820765] Registered IR keymap rc-hauppauge [13641.820958] input: em28xx IR (em28xx #0) as /devices/pci0000:00/0000:00:1a.7/usb1/1-4/rc/rc2/input11 [13641.821335] rc2: em28xx IR (em28xx #0) as /devices/pci0000:00/0000:00:1a.7/usb1/1-4/rc/rc2 [13641.822256] em28xx #0: Config register raw data: 0xd0 [13641.824526] em28xx #0: AC97 vendor ID = 0xffffffff [13641.825503] em28xx #0: AC97 features = 0x6a90 [13641.825507] em28xx #0: Empia 202 AC97 audio processor detected [13641.899015] em28xx #0: v4l2 driver version 0.1.3 [13641.944064] xc2028 15-0061: Loading firmware for type=BASE F8MHZ MTS (7), id 0000000000000000. [13643.470765] MTS (4), id 00000000000000ff: [13643.470776] xc2028 15-0061: Loading firmware for type=MTS (4), id 0000000100000007. [13643.717713] em28xx #0: V4L2 video device registered as video1 [13643.717718] em28xx #0: V4L2 VBI device registered as vbi0 [13643.718770] em28xx-audio.c: probing for em28xx Audio Vendor Class [13643.718775] em28xx-audio.c: Copyright (C) 2006 Markus Rechberger [13643.718778] em28xx-audio.c: Copyright (C) 2007-2011 Mauro Carvalho Chehab [13643.777148] xc2028 15-0061: attaching existing instance [13643.777154] xc2028 15-0061: type set to XCeive xc2028/xc3028 tuner [13643.777158] em28xx #0: em28xx #0/2: xc3028 attached [13643.777162] DVB: registering new adapter (em28xx #0) [13643.777167] DVB: registering adapter 0 frontend 0 (Micronas DRXD DVB-T)... [13643.777876] em28xx #0: Successfully loaded em28xx-dvb And here goes the lsmod output lsmod|grep em28xx em28xx_dvb 18579 0 dvb_core 110619 1 em28xx_dvb em28xx_alsa 18305 0 em28xx 109365 2 em28xx_dvb,em28xx_alsa v4l2_common 16454 3 tuner,tvp5150,em28xx videobuf_vmalloc 13589 1 em28xx videobuf_core 26390 2 em28xx,videobuf_vmalloc rc_core 26412 10 rc_hauppauge,ir_lirc_codec,ir_mce_kbd_decoder,ir_sony_decoder,ir_jvc_decoder,ir_rc6_decoder,ir_rc5_decoder,em28xx,ir_nec_decoder snd_pcm 97188 3 em28xx_alsa,snd_hda_intel,snd_hda_codec tveeprom 21249 1 em28xx videodev 98259 5 tuner,tvp5150,em28xx,v4l2_common,uvcvideo snd 78855 14 em28xx_alsa,snd_hda_codec_conexant,snd_hda_intel,snd_hda_codec,snd_hwdep,snd_pcm,snd_rawmidi,snd_seq,snd_timer,snd_seq_device Isn't this driver mainline now? Or this card is not supported? Or the analog functionality is screwed? I need the analog capture working for this card. Please help!
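
    Before concluding that the analog side is broken, it can help to drive the device directly with v4l2-ctl from the v4l-utils package; a rough sketch, assuming the composite input really is input 1 as mplayer reports:

      sudo apt-get install v4l-utils
      # confirm the inputs and the current TV standard the driver exposes
      v4l2-ctl -d /dev/video1 --list-inputs
      v4l2-ctl -d /dev/video1 --set-input=1
      v4l2-ctl -d /dev/video1 --set-standard=pal
      # request a plain YUYV capture format before retrying mplayer
      v4l2-ctl -d /dev/video1 --set-fmt-video=width=720,height=576,pixelformat=YUYV

    A "select timeout" with a green frame usually means the driver never delivers a captured buffer, so if these ioctls also fail, the em28xx analog path for this board revision is the likely culprit.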

    Read the article

  • Firmware/driver for Broadcom wifi card, PowerBook G4 running Ubuntu 12.04 [duplicate]

    - by user107831
    This question already has an answer here: How to Install Broadcom Wireless Drivers (40 answers). This is my first time working with Ubuntu (or Linux), so please be patient. I am running Ubuntu 12.04-powerpc "Precise Pangolin" on a Mac PowerBook G4 with a 1.67GHz processor. The firmware/driver for the wifi card is missing. For reasons not worth explaining, I cannot physically plug the computer into the network. I have another computer, a MacBook Pro running OS X, from which I can download files and carry them over on a USB thumb drive. The wifi card in the PowerBook G4 is by Broadcom: the chip is a BCM4306, rev. 3, and the PCI ID is 14e4:4302. I have downloaded b43-fwcutter_015-14_powerpc.deb and dropped it into the Home folder on the Ubuntu machine. However, it will not install. When I double-click, it opens in Ubuntu Software Center, but the "Install" button is inactive: I can't click it. There is a message beside the inactive button saying, "An older version of 'b43-fwcutter' is available in your normal software channels. Only install this file if you trust the origin." If I right-click the .deb file and open it with Archive Manager, it shows me the "DEBIAN" and "usr" folders, but I'm unsure what to do from there... and fairly certain this is not the right way to do things. Maybe I have the wrong version of b43-fwcutter for my machine or version of Ubuntu? The documentation for this problem is a mess: it refers to all sorts of out-of-date Ubuntu versions and to an array of different "cutter" and firmware files. Maybe I would be able to figure this out if I were a more seasoned Ubuntu user, but I have no idea why Software Center won't let me do the install. I would be very grateful for an explanation of how to get the wifi card working on this machine again. Thank you!
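
    Since Software Center is refusing mainly because a different b43-fwcutter already exists in the archives, installing the copied .deb with dpkg and then cutting the firmware out of the Broadcom driver blob sidesteps it entirely. A sketch of the usual offline procedure; the tarball name below is the one commonly referenced by the b43 documentation and should be treated as a placeholder:

      # install the locally copied package directly, bypassing Software Center
      sudo dpkg -i b43-fwcutter_015-14_powerpc.deb
      # copy the Broadcom driver archive over on the USB stick, then extract
      # the firmware into /lib/firmware (filename and path are placeholders)
      tar xjf broadcom-wl-5.100.138.tar.bz2
      sudo b43-fwcutter -w /lib/firmware broadcom-wl-5.100.138/linux/wl_apsta.o
      # reload the driver so it finds the new firmware
      sudo modprobe -r b43 && sudo modprobe b43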

    Read the article

  • Are Intel compilers really better than Microsoft ones?

    - by Rocket Surgeon
    Years ago I was surprised when I discovered that Intel sells Visual Studio-compatible compilers. I tried it in particular for C/C++, as well as the fantastic diagnostic tools. But the code was simply not computationally intensive enough to notice the difference. The only impression was: did Intel really do that for me just now? Wow, amazing tools with nanosecond resolution, unbelievable. But the trial ended and the team never seriously considered a purchase. From your experience, if license cost does not matter, which vendor is the winner? This is not a broad or vague question, or an attempt to spark a holy war; it is a question about two very visible tools. Nobody likes it when tools hold any mysteries or surprises, and choices between best and best are always a pain. I also understand the "grass is greener" argument. I want to hear all the "what if" stories. What if Intel just locally optimizes for the chip stepping of the month, and not every hardware target actually works as well as code compiled by Microsoft? What if AMD hardware is the target and everything slows down for no reason? Or, on the other hand, what if Intel's hardware has so many unnoticeable opportunities that Microsoft's compiler writers are too slow to adopt them and never implement them in the compiler? What if both are exactly the same, actually a single codebase just wrapped into two different boxes and licensed to both vendors by some third-party shop? And so on. But someone knows some answers.

    Read the article

  • Problem with SANE and CardScan

    - by TiuTalk
    I have a CardScan 60 II device and have installed SANE on my Ubuntu 10.10 laptop. The problem is that I can't make scanimage find the device: $ sudo sane-find-scanner # sane-find-scanner will now attempt to detect your scanner. If the # result is different from what you expected, first make sure your # scanner is powered up and properly connected to your computer. # No SCSI scanners found. If you expected something different, make sure that # you have loaded a kernel SCSI driver for your SCSI adapter. found USB scanner (vendor=0x08f0 [Corex Technologies Corporation], product=0x1000 [Corex CardScan 60], chip=LM9832/3) at libusb:006:002 # Your USB scanner was (probably) detected. It may or may not be supported by # SANE. Try scanimage -L and read the backend's manpage. # Not checking for parallel port scanners. # Most Scanners connected to the parallel port or other proprietary ports # can't be detected by this program. But scanimage cannot find the device: $ sudo scanimage -L No scanners were identified. If you were expecting something different, check that the scanner is plugged in, turned on and detected by the sane-find-scanner tool (if appropriate). Please read the documentation which came with this software (README, FAQ, manpages).
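
    sane-find-scanner only probes the USB bus, while scanimage -L additionally needs the matching backend to be enabled; a sketch of the usual checks (whether the cardscan backend actually supports the 60 II model is an assumption here):

      # make sure the cardscan backend is not commented out in the dispatcher
      grep -n cardscan /etc/sane.d/dll.conf
      # look at the backend's own configuration file, if present
      cat /etc/sane.d/cardscan.conf
      # rule out USB permission problems by probing as root
      sudo scanimage -L
      # verbose backend debugging shows why a device gets rejected
      SANE_DEBUG_CARDSCAN=128 scanimage -L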

    Read the article

  • recent unreliable wireless connection on 10.04 and 10.10

    - by gabkdlly
    Recently, my internet connection over wireless has become unreliable, on both a Dell laptop running Ubuntu 10.04 and my desktop running Ubuntu 10.10. The problem does not seem to occur on a laptop running Windows Vista. It also does not seem to occur on my Openmoko Freerunner (running Android 1.5), though I hardly ever use this device to connect over WLAN, so the problem may have just slipped by. The problem does not appear when I boot into Ubuntu 9.10 from a live CD (more precisely, I was able to ping fu-berlin.de for an hour without any packet loss). Under Ubuntu 10.10, I am experiencing about 33% packet loss. On my main Ubuntu desktop, I have tried the following wireless devices: a Longshine PCI card (an old device with an RTL8180L chip), a D-Link DWL-510 PCI card (this device threw warnings in dmesg), and a USB device from MSI (US54EX). Usually my wireless network shows up in the network manager with a normal signal strength, even when the connection speed is slow (which happens often) or the connection gets reset (asking me to click connect to re-authenticate my wireless connection). I have observed this problem with a Netgear KWGR614 router (with the manufacturer's firmware) as well as with a TP-LINK TL-WR741ND router running OpenWrt. Taking a look at my routers' logs, I find many instances of the following line: Tuesday,04 Jan 2011 03:53:01 [TCP SYN Flood][Deny access policy matched, dropping packet] I know that the Netgear router is susceptible to denial-of-service attacks, as I have previously been able to disrupt its operation by putting an nmap scan into a while loop. I use WEP on the Netgear router and WPA on the TP-LINK to encrypt the wireless connections. Is it possible that someone is jamming my signal?
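
    Two low-tech checks help separate driver trouble from interference or an overloaded router: log sustained packet loss against both the gateway and an outside host, and watch the interface's retry and error counters while the problem occurs. A sketch, with the interface name, gateway address and hostname as examples:

      # measure loss over a few minutes
      ping -c 300 192.168.1.1 | tail -2
      ping -c 300 fu-berlin.de | tail -2
      # Tx excessive retries, invalid misc and missed beacons hint at radio-level trouble
      iwconfig wlan0
      # follow association/deauthentication events while the link degrades
      tail -f /var/log/syslog | grep -i -e wpa -e wlan0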

    Read the article

  • Why am I not getting an sRGB default framebuffer?

    - by Aaron Rotenberg
    I'm trying to make my OpenGL Haskell program gamma correct by making appropriate use of sRGB framebuffers and textures, but I'm running into issues making the default framebuffer sRGB. Consider the following Haskell program, compiled for 32-bit Windows using GHC and linked against 32-bit freeglut: import Foreign.Marshal.Alloc(alloca) import Foreign.Ptr(Ptr) import Foreign.Storable(Storable, peek) import Graphics.Rendering.OpenGL.Raw import qualified Graphics.UI.GLUT as GLUT import Graphics.UI.GLUT(($=)) main :: IO () main = do (_progName, _args) <- GLUT.getArgsAndInitialize GLUT.initialDisplayMode $= [GLUT.SRGBMode] _window <- GLUT.createWindow "sRGB Test" -- To prove that I actually have freeglut working correctly. -- This will fail at runtime under classic GLUT. GLUT.closeCallback $= Just (return ()) glEnable gl_FRAMEBUFFER_SRGB colorEncoding <- allocaOut $ glGetFramebufferAttachmentParameteriv gl_FRAMEBUFFER gl_FRONT_LEFT gl_FRAMEBUFFER_ATTACHMENT_COLOR_ENCODING print colorEncoding allocaOut :: Storable a => (Ptr a -> IO b) -> IO a allocaOut f = alloca $ \ptr -> do f ptr peek ptr On my desktop (Windows 8 64-bit with a GeForce GTX 760 graphics card) this program outputs 9729, a.k.a. gl_LINEAR, indicating that the default framebuffer is using linear color space, even though I explicitly requested an sRGB window. This is reflected in the rendering results of the actual program I'm trying to write - everything looks washed out because my linear color values aren't being converted to sRGB before being written to the framebuffer. On the other hand, on my laptop (Windows 7 64-bit with an Intel graphics chip), the program prints 0 (huh?) and I get an sRGB default framebuffer by default whether I request one or not! And on both machines, if I manually create a non-default framebuffer bound to an sRGB texture, the program correctly prints 35904, a.k.a. gl_SRGB. Why am I getting different results on different hardware? Am I doing something wrong? How can I get an sRGB framebuffer consistently on all hardware and target OSes?

    Read the article

  • Advancing my Embedded knowledge.....with a CS degree.

    - by Mercfh
    So I graduated last December with a B.S. in Computer Science from a pretty good, well-known engineering college. Towards the end, however, I realized that I actually like assembly and lower-level C programming more than the higher-level, abstracted OO stuff (for example, I programmed my own device drivers for USB devices in Linux). But we really didn't concentrate much on that in college; perhaps an EE/CE degree would have been better, but I knew the classes, and things weren't that much different. I've messed around with Atmel AVRs/Arduino (mostly robotics) and Linux kernels/device drivers, but I really want to sharpen my skills and maybe one day get a job doing embedded work. (I have a job now, an entry-level software dev/tester position; it's a good job but not where my passion lies.) I'm pretty good with C and with the assembly languages of a few specific microcontrollers. Is this even possible with a CS degree, or am I screwed? (Since technically my degree didn't involve much embedded work.) If I'm not screwed, what should I be studying and learning, and how would I even go about it? I guess I could eventually say "experienced with XXXX microcontrollers/ASM/etc.", but it still wouldn't be the same as having a CE/EE degree. Also, going back to college isn't an option, just FYI. Edit: any book recommendations for getting used to this stuff? I have ARM System-on-Chip Architecture (2nd edition); it's good... for ARM stuff.

    Read the article

  • Blank pale blue screen with Live USB Kubuntu on AMD Sempron 2800+ processor

    - by WGCman
    I am trying to install Kubuntu onto a USB stick to use on my Acer Aspire 1362 laptop with an AMD Sempron 2800+ chip. Using Windows XP, I downloaded and saved to the laptop's hard drive: kubuntu-12.04.1-desktop-i386.iso from the Get Kubuntu website, and LinuxLive USB Creator 2.8.16.exe from the LinuxLive website. I then installed the latter and ran it, installing Kubuntu onto the memory stick. With the BIOS setup unchanged, the USB stick is ignored and Windows boots. If I change the BIOS boot order so that the memory stick takes precedence, I see a dark blue screen announcing Kubuntu 12.04, and on selecting either "Live mode" or "Persistent mode", messages flash by quickly, some of which appear to be error messages, including "trying to unpack rootfs image as initramfs", "cannot allocate resource for mainboard", and "no plug and play device found". Eventually I see a pale blue screen with four moving dots announcing Kubuntu 12.04, similar to the login screen of my Kubuntu desktop, but with no invitation to log in or indeed any dialog. After several minutes this changes to a black screen with more messages, including "no caching mode present" and "ADDRCONF(NETDEV_UP): wlan0: link is not ready", then degrades to a blank pale blue screen which can only be cleared by switching the computer off. Finding no way to log the error messages passing by, I managed to photograph most of them, but I know of no way to attach the photo to this forum. As suggested by User 68186 (to whom thanks!), I have edited my original post to reflect the recent progress, so the following two comments are now superseded.

    Read the article

  • Raspberry Pi and Java SE: A Platform for the Masses

    - by Jim Connors
    One of the more exciting developments in the embedded systems world has been the announcement and availability of the Raspberry Pi, a very capable computer that is no bigger than a credit card. At $35 US, initial demand for the device was so significant that very long back orders quickly ensued. After months of patiently waiting, mine finally arrived. Those initial growing pains appear to have been fixed, so availability should now be much more reasonable. At a very high level, here are some of the important specs: Broadcom BCM2835 System on a Chip (SoC); ARM1176JZFS, with floating point, running at 700MHz; VideoCore 4 GPU capable of Blu-ray quality playback; 256MB RAM; 2 USB ports and Ethernet; boots from SD card; Linux distributions (e.g. Debian) available. So what's taking place with respect to the Java platform and the Raspberry Pi? A Java SE Embedded binary suitable for the Raspberry Pi is available for download (ARM v6/7) here. Note, this is based on the armel architecture, a variety of ARM designed to support floating point through a compatibility library; it operates on more platforms but can hamper performance. In order to use this Java SE binary, select the available Debian distribution for your Raspberry Pi. The more recent Raspbian distribution is based on the armhf (hard float) architecture, which provides for more efficient hardware-based floating-point operations. However, armhf is not binary compatible with armel. As of the writing of this blog, Java SE Embedded binaries are not yet publicly available for the armhf-based Raspbian distro, but as mentioned in Henrik Stahl's blog, an armhf release is in the works. As demonstrated at the just-completed JavaOne 2012 San Francisco event, the graphics processing unit inside the Raspberry Pi is very capable indeed and makes an excellent candidate for JavaFX. As such, plans also call for a Pi-optimized version of JavaFX in a future release. A thriving community around the Raspberry Pi has developed at light speed, and as evidenced by the packed attendance at Pi-specific sessions at JavaOne 2012, the interest in Java for this platform is following suit. So stay tuned for more developments...
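
    Since the armel binary will not run on an armhf userland, it is worth confirming which ABI the SD-card image uses before downloading; a small check, where the JRE path is just an example of where the tarball might have been unpacked:

      # prints "armel" for the soft-float Debian image, "armhf" for Raspbian
      dpkg --print-architecture
      # run the unpacked Java SE Embedded runtime (path is an example)
      ~/ejre1.7.0_06/bin/java -version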

    Read the article

  • Problems with Intel Video Resolution on Acer Laptop Wide Display

    - by ricstr
    I have an Acer Aspire 5332 laptop on which I have just installed Ubuntu 12.04 x64, and it is causing some issues with the display on boot and with the video resolution. First and foremost, it will only boot past the purple screen if GRUB has been edited to replace 'quiet splash' with 'nomodeset'. Secondly, once it has booted with the 'nomodeset' option, it does not allow me to change the resolution higher or lower than 1024 x 768. Is it OK to use 'nomodeset' for normal use? Will this compromise the performance of other devices? The video card is an on-board one, integrated into the Intel GL40 chipset. The display is a widescreen LCD, and under Windows it could operate at various resolutions. Ideally I would like it to run at a resolution that fits the widescreen display, as everything is a bit stretched at the moment and I have less desktop space than I am used to. I believe the optimal resolution is 1366 x 768. Below is some information from the terminal which may be useful. ricstr@Aspire-5332:~$ lspci | grep -i VGA 00:02.0 VGA compatible controller: Intel Corporation Mobile 4 Series Chipset Integrated Graphics Controller (rev 09) ricstr@Aspire-5332:~$ xrandr xrandr: Failed to get size of gamma for output default Screen 0: minimum 1024 x 768, current 1024 x 768, maximum 1024 x 768 default connected 1024x768+0+0 0mm x 0mm 1024x768 0.0*
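
    For context, nomodeset prevents the Intel kernel modesetting driver from loading, so X usually falls back to the vesa driver, which is why the screen maximum is stuck at 1024x768 and why xrandr only shows an output called "default". Once the machine boots without nomodeset, the native mode can normally be added by hand; a sketch, where LVDS1 is the typical output name for the Intel driver and the modeline numbers are simply whatever cvt prints on your machine:

      # generate a modeline close to the panel's native resolution
      cvt 1366 768 60
      # register the printed modeline, attach it to the panel and select it
      xrandr --newmode "1368x768_60.00" 85.25 1368 1440 1576 1784 768 771 781 798 -hsync +vsync
      xrandr --addmode LVDS1 "1368x768_60.00"
      xrandr --output LVDS1 --mode "1368x768_60.00"

    While the vesa fallback is active these commands will not help, so the first step really is finding a boot option or driver update that lets the system boot with modesetting enabled.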

    Read the article

  • Screen resolution stuck at 1024x768

    - by Dananjaya
    I just updated from Ubuntu 10.10 to 11.04 and have an issue with the screen resolution. I have an Intel integrated graphics chip and my monitor supports resolutions larger than 1024x768 (in 10.10 I had been using 1280x1024). But as soon as I upgraded, I became stuck with the 1024x768 resolution and it seems I can't change it. Running xrandr in a terminal yields the following results: Screen 0: minimum 320 x 200, current 1024 x 768, maximum 4096 x 4096 LVDS1 connected 1024x768+0+0 (normal left inverted right x axis y axis) 0mm x 0mm 1280x800 58.1 + 1024x768 60.0* 800x600 60.3 56.2 640x480 59.9 VGA1 connected 1024x768+0+0 (normal left inverted right x axis y axis) 344mm x 194mm 1366x768 59.9 + 1360x768 60.0 1024x768 75.1 72.0 70.1 60.0* 832x624 74.6 800x600 72.2 75.0 60.3 56.2 640x480 72.8 75.0 66.7 60.0 720x400 70.1 1280x1024_60.00 (0xce) 109.0MHz h: width 1280 start 1368 end 1496 total 1712 skew 0 clock 63.7KHz v: height 1024 start 1027 end 1034 total 1063 clock 59.9Hz What may be the problem? Is it a bug? What steps should I take in order to get a higher resolution? (Changing xorg.conf, maybe?) Any insight is highly appreciated. Thanks in advance. UPDATE Screenshot after running xrandr --addmode VGA1 1360x768 As you can see, the side bar is not completely visible and the Ubuntu logo in the task bar is missing. Also, when I open an application, its menu bar (which should appear in the top panel) is missing as well.
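
    Given the xrandr output above, VGA1 already advertises 1366x768, and a 1280x1024 modeline exists but is not attached to any output; a sketch of selecting either one (output names are taken from that same output):

      # use the mode the external monitor itself advertises
      xrandr --output VGA1 --mode 1366x768
      # or attach and use the pre-generated 1280x1024 modeline
      xrandr --addmode VGA1 1280x1024_60.00
      xrandr --output VGA1 --mode 1280x1024_60.00
      # if both screens are cloned, giving each output its own position avoids
      # the cropping seen in the screenshot
      xrandr --output LVDS1 --mode 1280x800 --output VGA1 --mode 1366x768 --right-of LVDS1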

    Read the article

  • Are Intel compilers really better than the Microsoft ones?

    - by Rocket Surgeon
    Years ago, I was surprised when I discovered that Intel sells Visual Studio compatible compilers. I tried it in particular for C/C++ as well as the fantastic diagnostic tools. But the code was simply not computationally intensive enough to notice the difference. The only impression was: did Intel really do it for me just now, wow, amazing tools with nanosecond resolution, unbelievable. But the trial ended and the team never seriously considered a purchase. From your experience, if license cost does not matter, which vendor is the winner? This is not a broad or vague question or an attempt to spark a holy war; it is a question about two very visible tools. Nobody likes it when tools have any mysteries or surprises. And choices between best and best are always a pain. I also understand the grass-is-always-greener argument. I want to hear all the "what if" stories. What if Intel just locally optimizes it for the chip stepping of the month, and not every hardware target will actually work as well as Microsoft-compiled code? What if AMD hardware is the target and everything will slow down for no reason? Or, on the other hand, what if Intel's hardware has so many unnoticeable opportunities that Microsoft compiler writers are too slow to adopt them and never implement them in the compiler? What if both are exactly the same, actually a single codebase just wrapped into two different boxes and licensed to both vendors by some third-party shop? And so on. But someone knows some answers.

    Read the article

  • Automated testing tool development challenges (for embedded software)

    - by Karthi prime
    My boss wants me to come up with a proposal for the following tool: An IDE able to build, compile and debug, with programming of the microcontroller via JTAG. A test suite that reads the code in the IDE, auto-generates the test cases, and reports in-target unit-testing results (obtained by controlling code execution on the microcontroller via the IDE). A no-overhead code-coverage tool which interacts with the test suite and the IDE. My job is to sketch the high-level architecture of this tool so we can proceed further. My current understanding: there are toolchains available from the chip manufacturers which can be combined with an open-source IDE like Eclipse and an open-source flasher to make a complete IDE for a microcontroller. Test cases can be auto-generated by reading the source files through parsing and scripting, based on keywords. The test suite must be able to command the IDE to control execution through breakpoints and to read register contents from the microcontroller; this enables in-target unit testing. No-overhead code coverage should be achieved through no-overhead code instrumentation, so that it can run in the resource-constrained environment of the microcontroller. I have the following questions: Any advice on the validity of my understanding? What challenges will I face during development? Which open-source tools are helpful here? What is the likely development time for this software? Thanks
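
    The in-target execution-control piece can be prototyped cheaply with an open JTAG bridge and a scripted debugger session long before any IDE integration exists; a rough sketch with OpenOCD and GDB, where the probe/target configuration files and the symbol names are placeholders for whatever your hardware and test harness actually use:

      # start the JTAG bridge (config files depend on your probe and target)
      openocd -f interface/stlink-v2.cfg -f target/stm32f1x.cfg &
      # drive the target from a scripted GDB session: run to a breakpoint,
      # then read a register and a result variable for the test report
      arm-none-eabi-gdb firmware.elf -batch \
        -ex "target remote localhost:3333" \
        -ex "monitor reset halt" \
        -ex "break test_harness_done" \
        -ex "continue" \
        -ex "info registers r0" \
        -ex "print test_results"

    This covers only the "control execution via breakpoints and read registers" part of the proposal; test-case generation and no-overhead coverage are separate problems.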

    Read the article

  • Lots of static/crackling noises after ALSA HDA DKMS installation

    - by MartinB
    I am using a Samsung Chronos 7 laptop with the following sound setup: $ head -n 1 /proc/asound/card0/codec* ==> /proc/asound/card0/codec#0 <== Codec: Realtek ALC269VC ==> /proc/asound/card0/codec#3 <== Codec: Intel CougarPoint HDMI With the stock ALSA that comes with Ubuntu 12.04, I do not get any sound out of the headphones when I plug them into the headphone jack. After plugging the headphones in, I have to manually use Alsamixer to increase the volume, so that the headphones become usable. I have been told that this issue is due to my sound chip not being supported in the ALSA version that ships with Precise. A similar question at AskUbuntu and the Ubuntu Community Documentation point me to the ALSA DKMS installation. After installing the dkms module of yesterday's ALSA snapshot and rebooting, the headphone issue is indeed solved. I can now plug my headphones into the jack and instantly have sound on them. However, now I have tons of static noises and crackling when playing sound in VLC player or Skype (Firefox HTML5 playback seems to be fine, unless a Skype sound interferes with it). Is there a fix for this? I tried adding the Alsa PPA and installing the latest ALSA package proper, but that didn't have any effect, only the Alsa DKMS package seems to solve the headphone issue.
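
    Two things frequently behind crackling after an ALSA upgrade are PulseAudio's timer-based scheduling and aggressive HDA codec power saving; a sketch of disabling both to test, to be reverted if it makes no difference:

      # 1) turn off timer-based scheduling in PulseAudio
      sudo sed -i 's/^load-module module-udev-detect$/load-module module-udev-detect tsched=0/' /etc/pulse/default.pa
      pulseaudio -k    # it respawns automatically on 12.04
      # 2) stop the HDA codec from powering down between sounds
      echo "options snd-hda-intel power_save=0 power_save_controller=N" | sudo tee /etc/modprobe.d/snd-hda-intel-power.conf
      sudo alsa force-reload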

    Read the article

  • recent unreliable wireless connection with Netgear KWGR614 router

    - by gabkdlly
    Recently, my internet connection over wireless ( via a Netgear KWGR614 router ) has become unreliable, on both a Dell laptop running Ubuntu 10.04 as well as my Desktop running Ubuntu 10.10 . The problem does not seem to occur on a laptop running Windows Vista, nor on a Desktop running Windows 7 ( this machine is connected with an ethernet cable ). The problem does not seem to occur on my Openmoko Freerunner ( running Android 1.5 ), though I hardly ever use this device to connect over WLAN, so the problem may have just slipped by. On my main Ubuntu Desktop, I have tried the following wireless devices: a Longshine PCI card ( an old device with an RTL8180L chip ) a D-Link DWL-510 PCI card ( this device threw warnings in dmesg ) a USB device from MSI ( US54EX ). Usually my wireless network shows up in the network manager with a normal signal strength, even when the connection speed is slow or the connection gets reset ( asking me to click connect to re-authenticate my wireless connection ). I know that this router is susceptible to denial of service attacks, as I have previously been able to disrupt its operation by putting an nmap scan into a while loop. Is it possible that someone is jamming my signal ?

    Read the article

  • Audio comes out of both headphone and speaker at the same time.. Ubuntu 12.04LTS [closed]

    - by pst007x
    I have the same issue on an Aspire, running Ubuntu 12.04 LTS 64-bit with an onboard Realtek audio chip. If I plug in a headset, the audio does not switch from the internal speakers to the headset; instead it plays out of both at the same time. I have looked at the alsamixer settings; everything is on. I installed gnome-alsamixer and noticed that headphone was ticked; if I untick it, the main audio mutes and the headphones no longer work. The headset only works together with the internal speakers. Audio works fine on my other desktop and laptop running this release. 00:1b.0 Audio device: Intel Corporation 82801I (ICH9 Family) HD Audio Controller (rev 03) salvatore@salvatore-Aspire-7730:~$ cat /proc/asound/version Advanced Linux Sound Architecture Driver Version 1.0.24. salvatore@salvatore-Aspire-7730:~$ head -n 1 /proc/asound/card*/codec#* ==> /proc/asound/card0/codec#0 <== Codec: Realtek ALC888 ==> /proc/asound/card0/codec#1 <== Codec: LSI ID 1040 ==> /proc/asound/card0/codec#2 <== Codec: Intel Cantiga HDMI salvatore@salvatore-Aspire-7730:~$ aplay -l **** List of PLAYBACK Hardware Devices **** card 0: Intel [HDA Intel], device 0: ALC888 Analog [ALC888 Analog] Subdevices: 1/1 Subdevice #0: subdevice #0 card 0: Intel [HDA Intel], device 1: ALC888 Digital [ALC888 Digital] Subdevices: 1/1 Subdevice #0: subdevice #0 card 0: Intel [HDA Intel], device 3: HDMI 0 [HDMI 0] Subdevices: 1/1 Subdevice #0: subdevice #0 salvatore@salvatore-Aspire-7730:~$ uname -a Linux salvatore-Aspire-7730 3.2.0-23-generic #36-Ubuntu SMP Tue Apr 10 20:39:51 UTC 2012 x86_64 x86_64 x86_64 GNU/Linux salvatore@salvatore-Aspire-7730:~$ The file /etc/modprobe.d/alsa-base.conf does not exist. I tried this: sudo apt-get remove --purge alsa-base sudo apt-get remove --purge pulseaudio sudo apt-get install alsa-base sudo apt-get install pulseaudio sudo alsa force-reload Then: sudo apt-get purge pulseaudio gstreamer0.10-pulseaudio sudo apt-get install pulseaudio gstreamer0.10-pulseaudio indicator-sound I also tried the suggestion to open a terminal, run sudo gedit /etc/modprobe.d/alsa-base.conf, add a new line at the end of the file reading options snd-hda-intel model=generic, save, and reboot. But alsa-base.conf does not exist.
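
    Because /etc/modprobe.d/alsa-base.conf simply does not exist on this install, the model=generic quirk was never applied; creating the file and trying one of the Acer-oriented ALC888 quirks is the usual next step. A sketch, where the exact model value that fits an Aspire 7730 is an assumption (the kernel's HD-Audio-Models document lists the candidates):

      # create the options file with an Acer-oriented quirk for the ALC888
      echo "options snd-hda-intel model=acer-aspire" | sudo tee /etc/modprobe.d/alsa-base.conf
      # reload ALSA so the option takes effect, then re-test the headphone jack
      sudo alsa force-reload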

    Read the article

  • Effect of using dedicated NVidia card instead of Intel HD4000

    - by Sman789
    Short version: Can someone please advise me of the effect of adding a dedicated NVIDIA GeForce GT 630M card to an Ubuntu laptop, in terms of power consumption and performance gains or losses when doing general productivity tasks and booting up? Also, how good are the closed-source, open-source and Bumblebee drivers for these newer cards compared with the support for the Intel HD4000? Long version/background, if any info here is helpful: I'm thinking of ordering a laptop from PC Specialist (a UK company that actually sells machines without Windows pre-installed) with the following specification: Genesis IV: 15.6" AUO Matte 95% Gamut LED Widescreen (1920x1080) Intel® Core™i5 Dual Core Mobile Processor i5-3210M (2.50GHz) 3MB 4GB SAMSUNG 1600MHz SODIMM DDR3 MEMORY (1 x 4GB) 120GB INTEL® 520 SERIES SSD, SATA 6 Gb/s (upto 550MB/sR | 520MB/sW) Intel 2 Channel High Definition Audio + MIC/Headphone Jack GIGABIT LAN & WIRELESS INTEL® N135 802.11N (150Mbps) + BLUETOOTH Now, as I want this laptop mainly for work and not for games, I would be more than content with the HD4000 integrated chip which comes with the processor. However, for compatibility reasons, I am not able to get the specs I want unless I choose an NVIDIA GeForce GT 630M 1GB graphics card, which I don't have a great deal of use for. I'm willing to buy it, however, as it's still cheaper than any other laptop with the specs I want. However, I know that Linux power management isn't fantastic with open-source graphics drivers, and I don't know much about Bumblebee. Basically, whilst I'm happy to tolerate the card being there, I don't want to experience any negative effects on the rest of my system (battery, performance etc.), and if there are likely to be any, I might reconsider my purchase. So if anyone can advise me on the effects, I would be very grateful, since I doubt I can just turn the card off. Thank you for any assistance :)
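
    For reference, the usual way on 12.04 to keep a discrete NVIDIA GPU powered off except when explicitly requested is Bumblebee with bbswitch; a sketch of the setup and of verifying that the card really is off, using the PPA commonly documented at the time:

      sudo add-apt-repository ppa:bumblebee/stable
      sudo apt-get update
      sudo apt-get install bumblebee bumblebee-nvidia linux-headers-generic
      # after a reboot the discrete card should report OFF while idle
      cat /proc/acpi/bbswitch
      # run individual programs on the NVIDIA GPU only when needed
      optirun glxgears -info

    With the card powered down, battery life and day-to-day desktop behaviour are essentially those of the HD4000 alone.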

    Read the article

  • Why does working processors harder use more electrical power?

    - by GazTheDestroyer
    Back in the mists of time when I started coding, at least as far as I'm aware, processors all used a fixed amount of power. There was no such thing as a processor being "idle". These days there are all sorts of technologies for reducing power usage when the processor is not very busy, mostly by dynamically reducing the clock rate. My question is: why does running at a lower clock rate use less power? My mental picture of a processor is of a reference voltage (say 5V) representing a binary 1, and 0V representing 0. Therefore I tend to think of a constant 5V being applied across the entire chip, with the various logic gates disconnecting this voltage when "off", meaning a constant amount of power is being used. The rate at which these gates are turned on and off seems to have no relation to the power used. I have no doubt this is a hopelessly naive picture, but I am no electrical engineer. Can someone explain what's really going on with frequency scaling, and how it saves power? Are there any other ways that a processor's power consumption depends on its state? For example, does it use more power if more gates are open? How are mobile/low-power processors different from their desktop cousins? Are they just simpler (fewer transistors?), or is there some other fundamental design difference?
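
    The short answer is captured by the usual first-order model for the switching power of CMOS logic:

      P_{dyn} \approx \alpha \, C \, V_{dd}^{2} \, f

    where alpha is the fraction of gates that switch each cycle, C the switched capacitance, V_dd the supply voltage and f the clock frequency. Apart from leakage, current only flows while a gate charges or discharges its output, so fewer transitions per second means less energy per second; and because a lower clock also permits a lower supply voltage, scaling both down together reduces power roughly with the cube of the frequency.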

    Read the article

  • recent unreliable wireless connection

    - by gabkdlly
    Recently, my internet connection over wireless ( via a Netgear KWGR614 router ) has become unreliable, on both a Dell laptop running Ubuntu 10.04 as well as my Desktop running Ubuntu 10.10 . The problem does not seem to occur on a laptop running Windows Vista, nor on a Desktop running Windows 7 ( this machine is connected with an ethernet cable ). The problem does not seem to occur on my Openmoko Freerunner ( running Android 1.5 ), though I hardly ever use this device to connect over WLAN, so the problem may have just slipped by. On my main Ubuntu Desktop, I have tried the following wireless devices: a Longshine PCI card ( an old device with an RTL8180L chip ) a D-Link DWL-510 PCI card ( this device threw warnings in dmesg ) a USB device from MSI ( US54EX ). Usually my wireless network shows up in the network manager with a normal signal strength, even when the connection speed is slow or the connection gets reset ( asking me to click connect to re-authenticate my wireless connection ). I have observed this problem with a Netgear KWGR614 Router ( with the manufacturers firmware ), as well as with a TP-LINK TL-WR741ND router running OpenWrt. Taking a look at my routers logs, I find many instances of the following line: Tuesday,04 Jan 2011 03:53:01 [TCP SYN Flood][Deny access policy matched, dropping packet] I know that the Netgear router is susceptible to denial of service attacks, as I have previously been able to disrupt its operation by putting an nmap scan into a while loop. I use WEP or WPA to encrypt the wireless network. Is it possible that someone is jamming my signal ?

    Read the article

  • No MAU required on a T4

    - by jsavit
    Cryptic background One of the powerful features of the T-series servers is their hardware crypto acceleration, which dramatically speeds up the compute-intensive algorithms needed to encrypt and decrypt data. Previously, administrators setting up logical domains on older T-series servers had to explicitly assign crypto resources (called "MAU" for historical reasons, from the T1 chip that had "modular arithmetic units") to domains with a significant crypto workload (say, an SSL-based web server). This could be an administrative burden, as you had to choose which domains got the crypto units and issue the appropriate ldm set-mau N mydomain commands. The T4 changes things The T4 is fast. Really fast. Its clock rate and out-of-order (OOO) execution provide the single-thread performance that T-series machines previously did not have. If you have any preconceptions about T-series performance, or SPARC in general, based on the older servers (which, it must be said, were absolutely outstanding for multi-threaded applications), those assumptions are now obsolete. The T4 provides outstanding performance for all kinds of workloads, as illustrated at https://blogs.oracle.com/bestperf. While we all focused on this (did I mention the T4 is fast?), another feature of the T4 went largely unnoticed: the T4 servers have crypto acceleration "just built in", so administrators no longer have to assign crypto accelerator units to domains - it "just happens". This is way better, since you have crypto everywhere by default without having to manage it like a discrete and limited resource. It's a feature of the processor, like doing an integer add. With the T4 there is no management necessary; you just have hardware crypto everywhere, all the time, seamlessly. This change hasn't been widely advertised, and some administrators have wondered why they were unable to assign a MAU to a domain as they did with T2 and T3 machines. The answer is that there is no longer any separate MAU, so you don't have to take any action at all - just leave the default of 0. Summary Besides being much faster than its predecessors, the T4 also integrates hardware crypto acceleration so it is seamlessly available to applications, whether domains are being used or not. Administrators no longer have to control how the crypto units are allocated - it "just happens"
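
    Seen from the administrator's side, the difference is that the T2/T3 crypto units were discrete resources handed out per domain, while on a T4 the crypto instructions are simply part of every hardware strand and Solaris reports them as part of the instruction set. A sketch; the ldm syntax is as quoted in the article, and the exact isainfo output varies with the Solaris release:

      # T2/T3 era: crypto accelerators were explicitly assigned per domain
      ldm set-mau 1 mydomain
      ldm list -o crypto mydomain
      # T4: nothing to assign; the crypto support shows up in the ISA itself
      isainfo -v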

    Read the article

  • Will proprietary software-based sound enhancements work with Ubuntu? (BeatsAudio, Dolby)

    - by LiveWireBT
    This question is targeted at mainstream and gamer-grade software-based audio/sound enhancements found in highly integrated computing and entertainment systems such as laptops, tablets and smartphones. These are mostly marketed with fancy badges of well-known audio-related brands on the product or packaging, while the actual implementation and the components used remain uncertain and are poorly differentiated from the general audio capabilities of the system or device. This question is not about actual hardware like speakers. If your headphones are not properly detected, or your speakers are assigned wrongly, work partially or not at all, then your sound card or chip is not properly detected and you should take a look at troubleshooting audio issues. This question is also not about enthusiast or recording-grade hardware like recording interfaces, amplifiers and DACs in a variety of form factors. And this question is also not about audio encoding and playback of different audio formats like Dolby Digital, Dolby TrueHD and DTS. Most of these may be subject to patents and licensing; see restricted formats. If you are just searching for an equalizer, please take a look at this question: Is there any Sound enhancers/equalizer? Simply speaking: every feature where you would flip a switch or check a box in a fancy-looking interface in Windows that makes the sound change from neutral to fancy.

    Read the article
