Search Results

Search found 18084 results on 724 pages for 'graphics programming'.

Page 10/724

  • Win32 and Win64 programming in C sources?

    - by Nick Rosencrantz
    I'm learning OpenGL with C, which makes me include the windows.h file in my project. I'd like to look at some more specific Windows functions, and I wonder if you can cite some good sources for learning the basics of Win32 and Win64 programming in C (or C++). I use MS Visual C++ and I prefer to stick with C even though much of the Windows API seems to be C++. I'd like my program to be portable, and by using a platform-independent graphics library like OpenGL I could make it portable with some slight changes for window management. Could you direct me to books or web links where I can find more info? I've already studied the OpenGL red book and The C Programming Language; what I'm looking for is the platform-dependent stuff and how to handle it, since I run both Linux and Windows. I find the Visual Studio development environment pretty good, but the debugger gdb is not available on Windows, so it's a trade-off which environment I'll choose in the end - Linux with gcc or Windows with MSVC. Here is the program that draws a graphics primitive with some use of windows.h. This program is also runnable on Linux without changing the code that actually draws the graphics primitive:

    #include <windows.h>
    #include <gl/gl.h>

    LRESULT CALLBACK WindowProc(HWND, UINT, WPARAM, LPARAM);
    void EnableOpenGL(HWND hwnd, HDC*, HGLRC*);
    void DisableOpenGL(HWND, HDC, HGLRC);

    int WINAPI WinMain(HINSTANCE hInstance, HINSTANCE hPrevInstance,
                       LPSTR lpCmdLine, int nCmdShow)
    {
        WNDCLASSEX wcex;
        HWND hwnd;
        HDC hDC;
        HGLRC hRC;
        MSG msg;
        BOOL bQuit = FALSE;
        float theta = 0.0f;

        /* register window class */
        wcex.cbSize = sizeof(WNDCLASSEX);
        wcex.style = CS_OWNDC;
        wcex.lpfnWndProc = WindowProc;
        wcex.cbClsExtra = 0;
        wcex.cbWndExtra = 0;
        wcex.hInstance = hInstance;
        wcex.hIcon = LoadIcon(NULL, IDI_APPLICATION);
        wcex.hCursor = LoadCursor(NULL, IDC_ARROW);
        wcex.hbrBackground = (HBRUSH)GetStockObject(BLACK_BRUSH);
        wcex.lpszMenuName = NULL;
        wcex.lpszClassName = "GLSample";
        wcex.hIconSm = LoadIcon(NULL, IDI_APPLICATION);

        if (!RegisterClassEx(&wcex))
            return 0;

        /* create main window */
        hwnd = CreateWindowEx(0, "GLSample", "OpenGL Sample", WS_OVERLAPPEDWINDOW,
                              CW_USEDEFAULT, CW_USEDEFAULT, 256, 256,
                              NULL, NULL, hInstance, NULL);
        ShowWindow(hwnd, nCmdShow);

        /* enable OpenGL for the window */
        EnableOpenGL(hwnd, &hDC, &hRC);

        /* program main loop */
        while (!bQuit)
        {
            /* check for messages */
            if (PeekMessage(&msg, NULL, 0, 0, PM_REMOVE))
            {
                /* handle or dispatch messages */
                if (msg.message == WM_QUIT)
                {
                    bQuit = TRUE;
                }
                else
                {
                    TranslateMessage(&msg);
                    DispatchMessage(&msg);
                }
            }
            else
            {
                /* OpenGL animation code goes here */
                glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
                glClear(GL_COLOR_BUFFER_BIT);

                glPushMatrix();
                glRotatef(theta, 0.0f, 0.0f, 1.0f);
                glBegin(GL_TRIANGLES);
                glColor3f(1.0f, 0.0f, 0.0f); glVertex2f(0.0f, 1.0f);
                glColor3f(0.0f, 1.0f, 0.0f); glVertex2f(0.87f, -0.5f);
                glColor3f(0.0f, 0.0f, 1.0f); glVertex2f(-0.87f, -0.5f);
                glEnd();
                glPopMatrix();

                SwapBuffers(hDC);
                theta += 1.0f;
                Sleep(1);
            }
        }

        /* shutdown OpenGL */
        DisableOpenGL(hwnd, hDC, hRC);

        /* destroy the window explicitly */
        DestroyWindow(hwnd);

        return msg.wParam;
    }

    LRESULT CALLBACK WindowProc(HWND hwnd, UINT uMsg, WPARAM wParam, LPARAM lParam)
    {
        switch (uMsg)
        {
            case WM_CLOSE:
                PostQuitMessage(0);
                break;
            case WM_DESTROY:
                return 0;
            case WM_KEYDOWN:
            {
                switch (wParam)
                {
                    case VK_ESCAPE:
                        PostQuitMessage(0);
                        break;
                }
            }
            break;
            default:
                return DefWindowProc(hwnd, uMsg, wParam, lParam);
        }
        return 0;
    }

    void EnableOpenGL(HWND hwnd, HDC* hDC, HGLRC* hRC)
    {
        PIXELFORMATDESCRIPTOR pfd;
        int iFormat;

        /* get the device context (DC) */
        *hDC = GetDC(hwnd);

        /* set the pixel format for the DC */
        ZeroMemory(&pfd, sizeof(pfd));
        pfd.nSize = sizeof(pfd);
        pfd.nVersion = 1;
        pfd.dwFlags = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
        pfd.iPixelType = PFD_TYPE_RGBA;
        pfd.cColorBits = 24;
        pfd.cDepthBits = 16;
        pfd.iLayerType = PFD_MAIN_PLANE;
        iFormat = ChoosePixelFormat(*hDC, &pfd);
        SetPixelFormat(*hDC, iFormat, &pfd);

        /* create and enable the render context (RC) */
        *hRC = wglCreateContext(*hDC);
        wglMakeCurrent(*hDC, *hRC);
    }

    void DisableOpenGL(HWND hwnd, HDC hDC, HGLRC hRC)
    {
        wglMakeCurrent(NULL, NULL);
        wglDeleteContext(hRC);
        ReleaseDC(hwnd, hDC);
    }

    Read the article

  • How difficult is it to change from embedded programming to high-level programming? [on hold]

    - by anudeep shetty
    I have a background in Computer Science. After finishing my Bachelor's degree, I worked for over a year on embedded programming on Linux file systems. After that I pursued my Master's, where most of my course choices involved web, Java and databases. Now I have an offer from a company to work at the OS level. The company is pretty good, but I feel that my Master's has gone to waste. I wanted to know: is it common for a Computer Science major to work on low-level coding, and is there a possibility that I can work at this company for some years and then move on to an opportunity where I can work on high-level coding? Also, is working on low-level programming a safe choice in terms of job opportunities?

    Read the article

  • Ways to earn money through programming and/or programming knowledge [closed]

    - by Jason Swett
    It occurred to me today that it might be useful to make a list of all the ways to earn money through either actual programming or just programming knowledge. I imagine it's probably a finite list as long as you stick to a reasonable level of granularity. Here's what I have so far:
    - Trading your time for money (i.e. having a job or being a freelancer)
    - Building your own software product (a full-fledged startup or a tiny mobile app or whatever)
    - Giving talks at conferences and meetups
    - Teaching students in a classroom
    - Writing a book or blog (these are products, but non-software products)
    I've probably missed at least a few. What else is there? (I'm not sure whether this is an appropriate question, by the way. I think I would select the best answer based on how practical/original/interesting/numerous your suggestions are.)

    Read the article

  • New graphics card (GTX 760) slowing down entire PC

    - by Cayetano Gonçalves
    My new graphics card is making my PC totally unusable. It boots up really slowly, and when the Windows screen comes on, the mouse lags really far behind. Nothing opens at a normal speed. However, when I put in my 5-year-old graphics card, it all works fine. I'm currently using a Foxconn Renaissance LGA 1366 Intel X58 ATX motherboard, an Intel Core i7-950 Bloomfield 3.06GHz LGA 1366 130W processor, and an EVGA SuperNOVA 850 G2 80 PLUS Gold Certified ATX12V/EPS12V 850W power supply. I know it can't be the power supply, because I just bought it today to try to fix the problem. I've also installed the newest version of the BIOS available for my motherboard. I've also seen extreme variations in CPU usage while the new graphics card is in; with the old graphics card installed, it is much calmer. Any thoughts?

    Read the article

  • HP Envy 14, Ubuntu 10.10 and trouble with the graphics cards

    - by Carsten Gehling
    A few days ago I bought an HP Envy 14 containing two graphics cards: an integrated Intel graphics card and an ATI HD 5650. I've installed Ubuntu 10.10 32-bit on the machine. Most things work fine out of the box, but the graphics cards are giving me trouble. When booting, I get the message "failed to get i915 symbols, graphics turbo disabled". Then the screen blanks out during the remaining boot period. I am able to get the display working by changing to one of the consoles, then closing and opening the laptop's lid. It seems that Ubuntu gets confused about which card to use. I've read here: http://www.andreas-demmer.de/en/2010/07/18/testbericht-linux-auf-dem-hp-envy-14 that I should be able to turn off one of the cards by echoing keywords into /sys/kernel/debug/vgaswitcheroo/switch, but that path is not available on my system. The BIOS does not have any method to switch off the ATI card. Help anyone? /Carsten

    Read the article

  • Intel hybrid graphics stuck in power save mode

    - by OZZIE
    I have an Asus UL30V with Intel hybrid graphics. When I bought it, it used to turn off the better graphics card to save power and run on a more power-efficient one. When I connected the laptop to a power source again, I could go to the Intel graphics icon in the task bar and click on "switch to enhanced performance mode" or something similar. But about 2-3 months ago that option suddenly vanished. I cannot switch modes anymore; it has gotten stuck in power-save mode. I can change the Windows power setting, but that doesn't enable the better graphics card :( What should I do? I've tried restarting the computer and that does not work. I really hope I don't have to reinstall the drivers! It seems like other people have had the same issue: http://en.community.dell.com/support-forums/laptop/f/3518/p/19422695/19994601.aspx , and he hasn't even got the same computer as me. I have Windows 7 Home Premium x64. Any ideas?

    Read the article

  • Intermittent graphics issue

    - by Rob
    An older (~3 year old) PC I have passed on to my younger brother has a Sony 17" LCD monitor and uses on-board Intel GMA 3100 graphics (supports Aero under Windows 7 and no more is required). Recently he has complained to me twice that after booting up the screen looks all 'fuzzy'. Initially I thought maybe he messed up the ClearType settings by mistake, but resetting that did nothing. The only thing that finally worked was changing the resolution to something else temporarily, then setting it back to the original native (highest supported) resolution. The second time this happened the same 'fix' worked. This is happening only intermittently as of now and is not predictable/reproducible. I suspect either the monitor or on-board graphics is dying. How can I check and confirm which of the two it is (or maybe something else you guys can think of)? If it's the on-board graphics, would I be able to keep using the same motherboard with the addition of a cheap PCI-E graphics card?

    Read the article

  • Monitor has yellow tint when used with the graphics card

    - by artknish
    I have an NVIDIA graphics card that, when used, produces a yellow tint on my monitor's display. I tried with another monitor and it had the same issue. I then switched back to my motherboard's built-in graphics output (not sure what the technical term is), and the display seems fine, except I can't get the optimal resolution of 1440x900 to work. So is my graphics card's life over? Or can I get it repaired? Any self remedies without calling a hardware guy? Should I try with a DVI cable? I've been using the VGA cable from my graphics card to the monitor so far. Thanks for your suggestions!

    Read the article

  • Graphics card for dual 2560 x 1440 output

    - by Bender
    As the title suggests, I want to power two 2560 x 1440 monitors from one graphics card. I am just about to send my second graphics card back to Amazon, so before I purchase a third incorrect card I thought I should get some advice. My latest card has dual DL-DVI output but only supports 2560 x 1440 on one output; the other maxes out at 1920 x 1080. Can graphics cards with two Dual Link DVI connectors (or alternatively, one Dual Link DVI and one DisplayPort) support a full 2560 x 1440 on both monitors, or does that depend on the graphics hardware and not only the connectors?

    Read the article

  • Starting to make 2D games in C++

    - by Ashley
    I'm fairly experienced with C and C#, but I've only ever created console/Windows applications. I'm also experienced with AS3 and I've made some Flash games. I want to make proper 2D games in C++, but I have no idea where to begin with graphics. There are entire books devoted to game development in C++ that only work with console applications, and I'm finding the lack of resources and tutorials for proper 2D games frustrating... I'm also not particularly interested in using existing engines because I want total control of what I create. I've heard of the Allegro library; is it something important/popular that I should look into? How will I use DirectX? Any resources or links to tutorials or information are greatly appreciated.

    Read the article

  • How to make "xrandr" work with GMA500?

    - by Nwbie
    Is it an error in the graphics chip driver, Xorg, or the kernel? I have an Asus T91MT with a GMA500, running Ubuntu 12.04.1. I would like to see at least a notice of connection. A log of xrandr:

    $ lspci | grep VGA
    00:02.0 VGA compatible controller: Intel Corporation System Controller Hub (SCH Poulsbo) Graphics Controller (rev 07)
    vp@vc:~$ xrandr --verbose
    xrandr: Failed to get size of gamma for output default
    Screen 0: minimum 1024 x 600, current 1024 x 600, maximum 1024 x 600
    default connected 1024x600+0+0 (0x138) normal (normal) 0mm x 0mm
        Identifier: 0x137  Timestamp: 26863  Subpixel: unknown  Clones:  CRTC: 0  CRTCs: 0
        Transform: 1.000000 0.000000 0.000000  0.000000 1.000000 0.000000  0.000000 0.000000 1.000000  filter:
      1024x600 (0x138) 0.0MHz *current
            h: width 1024 start 0 end 0 total 1024 skew 0 clock 0.0KHz
            v: height 600 start 0 end 0 total 600 clock 0.0Hz
    vp@vc:~$ xrandr --prop
    xrandr: Failed to get size of gamma for output default
    Screen 0: minimum 1024 x 600, current 1024 x 600, maximum 1024 x 600
    default connected 1024x600+0+0 0mm x 0mm
       1024x600 0.0*
    vp@vc:~$

    Please help, I am a Linux newbie and I am tired ;/

    Read the article

  • gnome shell with very high CPU usage

    - by 501 - not implemented
    I'm running Ubuntu GNOME 13.10 on my Dell Latitude E6510 with an i5 M560. The i5 comes with embedded Intel HD 3400 Graphics. The average CPU usage of gnome-shell is around 160%, which is too high, I think. Is there a problem with a driver? If I call the command glxinfo | grep OpenGL it returns:

    OpenGL vendor string: VMware, Inc.
    OpenGL renderer string: Gallium 0.4 on llvmpipe (LLVM 3.3, 128 bits)
    OpenGL version string: 2.1 Mesa 9.2.1
    OpenGL shading language version string: 1.30
    OpenGL extensions:

    Greetings

    Read the article

  • How come ASUS sells 1225Cs with Ubuntu preinstalled when we don't have drivers for the GMA 3600?

    - by sumit_gt
    I recently purchased an ASUS 1225C Cedar Trail netbook. It didn't come preinstalled with Ubuntu like it does in some markets. I installed Ubuntu 12.04 LTS on it. While Ubuntu works just fine, I am bothered by the lack of proper graphics support because of the Intel GMA 3600. I am aware that this is not a problem with Ubuntu. My question is this: how does ASUS manage to sell these netbooks in other markets with Ubuntu preinstalled? How does ASUS manage to get everything to work? And if they can, why am I not able to do the same? link: For more on Asus 1225C As you can see, they show Ubuntu working perfectly in the screenshots. Is this false advertising?

    Read the article

  • Gnome 3 freezes on logon on samsung RV 509

    - by Noufal
    I have a Samsung NP-RV509-A0FIN and I tried to install GNU/Linux with GNOME 3.2 on it. I tried Fedora 16, Ubuntu 11.10 and Linux Mint 12 RC, but with no success. All of them freeze upon login into GNOME Shell. I think it is a problem with the graphics driver, so I tried the xorg-edgers PPA on my last installation, i.e., Linux Mint. I also tried various Intel graphics packages listed in the Synaptic package manager, but again no success. My device configuration is as follows (obtained from Windows 7):
    More details about my computer (component - details - subscore; base score 4.6):
    - Processor: Intel(R) Pentium(R) CPU P6200 @ 2.13GHz - 5.6
    - Memory (RAM): 4.00 GB - 7.2
    - Graphics: Intel(R) HD Graphics - 4.6
    - Gaming graphics: 1562 MB total available graphics memory - 5.2
    - Primary hard disk: 12 GB free (50 GB total) - 5.9
    - Windows 7 Ultimate
    System: Manufacturer SAMSUNG ELECTRONICS CO., LTD.; Model RV409/RV509/RV709; Total amount of system memory 4.00 GB RAM; System type 32-bit operating system; Number of processor cores 2; 64-bit capable Yes
    Storage: Total size of hard disk(s) 418 GB; Disk partition (C:) 12 GB free (50 GB total); Media drive (D:) CD/DVD; Disk partition (E:) 526 MB free (191 GB total); Disk partition (F:) 101 GB free (177 GB total)
    Graphics: Display adapter type Intel(R) HD Graphics; Total available graphics memory 1562 MB; Dedicated graphics memory 64 MB; Dedicated system memory 0 MB; Shared system memory 1498 MB; Display adapter driver version 8.15.10.2202; Primary monitor resolution 1366x768; DirectX version DirectX 10
    Network: Realtek PCIe GBE Family Controller; Broadcom 802.11n Network Adapter; Microsoft Virtual WiFi Miniport Adapter
    Notes: The gaming graphics score is based on the primary graphics adapter. If this system has linked or multiple graphics adapters, some software applications may see additional performance benefits.
    Any help is appreciated, and thanks in advance.

    Read the article

  • Problem implementing Blinn–Phong shading model

    - by Joe Hopfgartner
    I did this very simple, perfectly working implementation of the Phong reflection model (there is no ambient term implemented yet, but that doesn't bother me for now). The functions should be self-explanatory.

    /**
     * Implements the classic Phong illumination model using a reflected light
     * vector.
     */
    public class PhongIllumination implements IlluminationModel {
        @RGBParam(r = 0, g = 0, b = 0)
        public Vec3 ambient;

        @RGBParam(r = 1, g = 1, b = 1)
        public Vec3 diffuse;

        @RGBParam(r = 1, g = 1, b = 1)
        public Vec3 specular;

        @FloatParam(value = 20, min = 1, max = 200.0f)
        public float shininess;

        /*
         * Calculate the intensity of light reflected to the viewer.
         *
         * @param P = The surface position expressed in world coordinates.
         * @param V = Normalized viewing vector from surface to eye in world coordinates.
         * @param N = Normalized normal vector at surface point in world coordinates.
         * @param surfaceColor = Color of the surface at the current position.
         * @param lights = The active light sources in the scene.
         *
         * @return Reflected light intensity I.
         */
        public Vec3 shade(Vec3 P, Vec3 V, Vec3 N, Vec3 surfaceColor, Light lights[]) {
            Vec3 surfaceColordiffused = Vec3.mul(surfaceColor, diffuse);
            Vec3 totalintensity = new Vec3(0, 0, 0);

            for (int i = 0; i < lights.length; i++) {
                Vec3 L = lights[i].calcDirection(P);
                N = N.normalize();
                V = V.normalize();
                Vec3 R = Vec3.reflect(L, N); // reflection vector

                float diffuseLight = Vec3.dot(N, L);
                float specularLight = Vec3.dot(V, R);

                if (diffuseLight > 0) {
                    totalintensity = Vec3.add(Vec3.mul(Vec3.mul(surfaceColordiffused,
                            lights[i].calcIntensity(P)), diffuseLight), totalintensity);

                    if (specularLight > 0) {
                        Vec3 Il = lights[i].calcIntensity(P);
                        Vec3 Ilincident = Vec3.mul(Il, Math.max(0.0f, Vec3.dot(N, L)));
                        Vec3 intensity = Vec3.mul(Vec3.mul(specular, Ilincident),
                                (float) Math.pow(specularLight, shininess));
                        totalintensity = Vec3.add(totalintensity, intensity);
                    }
                }
            }
            return totalintensity;
        }
    }

    Now I need to adapt it to become a Blinn-Phong illumination model. I used the formulas from Hearn and Baker, followed pseudocode, and tried to implement it multiple times according to Wikipedia articles in several languages, but it never worked. I just get no specular reflections, or they are so weak and/or at the wrong place and/or have the wrong color. From the numerous wrong implementations I post some little code that already seems to be wrong. So I calculate my halfway vector and my new specular light like so:

    Vec3 H = Vec3.mul(Vec3.add(L.normalize(), V), Vec3.add(L.normalize(), V).length());
    float specularLight = Vec3.dot(H, N);

    With these little changes it should already work (maybe not with correct intensity, but basically it should be correct). But the result is wrong. Here are two images: on the left how it should render correctly, on the right how it actually renders. If I lower the shininess factor you can see a little specular light at the top right. Although I understand the concept of Phong illumination and also the simplified, more performant adaptation of Blinn-Phong, I have been trying for days and just can't get it to work. Any help is appreciated.
    Edit: I was made aware of an error by this answer: I am multiplying by |L+V| instead of dividing by it when calculating H. I changed to dividing like so:

    Vec3 H = Vec3.mul(Vec3.add(L.normalize(), V), 1 / Vec3.add(L.normalize(), V).length());

    Unfortunately this doesn't change much. The results look like this, and if I raise the specular constant and lower the shininess you can see the effects more clearly, in a similarly wrong way. However, this division is just the normalisation. I think I am missing one step, because formulas like this just don't make sense to me. If you look at this picture: http://en.wikipedia.org/wiki/File:Blinn-Phong_vectors.svg the projection of H onto N is far smaller than that of V onto R. And if you imagine changing the vector V in the picture, the angle is the same when the viewing vector is "on the left side" and becomes more and more different when going to the right. I personally would multiply the whole projection by two to get something similar (and the whole point is to avoid the calculation of R). Although I didn't read anything about that anywhere, I am going to try it out... Result: the intensity of the specular light is far too high (white areas) and the position is still wrong. I think I am messing something else up, because the reflections are just at the wrong place. But what?
    Edit: Now I read on Wikipedia in the notes that the angle between N and H is in fact approximately half of that between V and R. To compensate for that I should multiply my shininess exponent by 4 rather than scaling my projection. If I do that I end up with this: far too intense, but still one thing - the reflection is at the wrong place. Where could I be messing up my vectors?
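    For reference, a minimal sketch of how the Blinn-Phong half-vector term is usually computed, reusing the Vec3 helpers from the code above (Vec3.add, Vec3.dot, Vec3.mul, normalize()) and assuming they behave exactly as in the posted PhongIllumination class; it illustrates the standard formula H = (L + V) / |L + V| and the commonly quoted exponent scaling, not a verified drop-in fix:

    // Hypothetical helper in the spirit of the Vec3 API used above.
    static Vec3 blinnSpecular(Vec3 L, Vec3 V, Vec3 N,
                              Vec3 specular, Vec3 lightIntensity, float shininess) {
        // Halfway vector: H = (L + V) / |L + V|, i.e. just the normalized sum.
        Vec3 H = Vec3.add(L.normalize(), V.normalize()).normalize();
        float nDotH = Math.max(0.0f, Vec3.dot(N, H));
        // The N.H angle is roughly half the V.R angle, so the exponent is often
        // scaled (a factor of about 4 is commonly quoted) to approximate classic Phong.
        return Vec3.mul(Vec3.mul(specular, lightIntensity),
                (float) Math.pow(nDotH, 4.0f * shininess));
    }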

    Read the article

  • Drawing a path with Bezier curves

    - by OgreSwamp
    Hello. I have a task: draw a smooth curve.
    Input: a set of points (they are added in real time).
    Current solution: I use each 4 points to draw a cubic Bezier curve (1st - start, 2nd and 3rd - control points, 4th - end). The end point of each curve is the start point for the next one.
    Problem: at the curve connections I often have a "fracture" (an angle).
    Can you tell me how to connect my points more smoothly? Thanks!
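    For reference, the usual way to avoid that angle is to make the tangents agree at the join: pick the first control point of each new segment as the mirror of the previous segment's last control point about the shared endpoint. A minimal sketch of that reflection, using a hypothetical Point class (any 2D vector type would do):

    // Hypothetical 2D point type, just for illustration.
    class Point {
        final double x, y;
        Point(double x, double y) { this.x = x; this.y = y; }
    }

    class BezierSmoothing {
        // First control point of the next segment = reflection of the previous
        // segment's last control point (prevCtrl) about the shared endpoint (joint):
        // c1' = 2 * joint - prevCtrl, which makes the tangent continuous (C1) at the join.
        static Point smoothControlPoint(Point prevCtrl, Point joint) {
            return new Point(2 * joint.x - prevCtrl.x, 2 * joint.y - prevCtrl.y);
        }
    }

    With this, consecutive segments share both the endpoint and the tangent direction at the join, so the assembled curve has no visible angle at the connections.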

    Read the article

  • Can I get the font information from System.Drawing.Graphics in C#?

    - by Bahgat Mashaly
    Hello, I get the Graphics from: Graphics g = System.Drawing.Graphics.FromHwnd(button1.Handle); Can I get the font information from this Graphics? I tried to get the font by using the GetTextFace API function, but it returns "System", which means the default OS font. I also tried SendMessage(button1.Handle, WM_GETFONT, 0, 0); but it returns 0, which also means the default OS font. I have found the cause of the problem: it is due to the FlatStyle property. See this link: http://blogs.msdn.com/b/michkap/archive/2008/09/26/8965526.aspx Thanks

    Read the article

  • Finding a new programming language for web development?

    - by Xeoncross
    I'm wondering if there are any unbiased resources that give good, specific overviews of programming languages and their intended goals. I would like to learn a new language, but visiting the sites of each language isn't working. Each one talks about how great it is without much mention of its weaknesses or specific goals. Ruby is a dynamic, open source programming language with a focus on simplicity and productivity. Python is a programming language that lets you work more quickly and integrate your systems more effectively. Having been a PHP developer for years, Vic Cherubini sums up my plight well: I knew PHP well, had my own framework, and could work quickly to get something up and running. I programmed like this throughout the MVC revolution. I got better and better jobs (read: better paying, better title) as a PHP developer, but all along the way realizing that the code I wrote on my own time was great, and the code I worked with at work was horrible. Like, worse than horrible. Atrocious. OS Commerce level bad. Having side projects kept me sane, because the code I worked with at work made me miserable. This is why I'm retiring from PHP for my side projects and new programming ventures. I'm spent with PHP. Exhausted, if you will. I've reached a level where I think I'm at the top with it as a language and if I don't move on to a new language soon, I'll be done completely with programming and I do not want that. Languages I've looked at include JavaScript (for node.js), Ruby, Python, & Erlang. I've even thought about Scala or C++. The problem is figuring out which ones are built to handle my needs the best. So where can I go to skip the hype and get real information about the maturity of a platform, the size of the community, and the strengths & weaknesses of that language? If I know these then picking a language to continue my web development should be easy.

    Read the article

  • Strange Ubuntu Random Display [Video]

    - by d4v1dv00
    I have had this random display issue ever since Ubuntu 11.04, and now running Ubuntu 11.10 the problem still persists. It is very hard for me to explain, so I uploaded a video to demonstrate it. Before I converted from Windows 7, this issue never happened. The symptom is so random that I cannot reproduce it or tell precisely when it will happen again. My wild guess is that it could be related to a driver. Below is my detailed system information:

    $ lspci
    00:00.0 Host bridge: Intel Corporation 2nd Generation Core Processor Family DRAM Controller (rev 09)
    00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)
    00:16.0 Communication controller: Intel Corporation 6 Series/C200 Series Chipset Family MEI Controller #1 (rev 04)
    00:1a.0 USB Controller: Intel Corporation 6 Series/C200 Series Chipset Family USB Enhanced Host Controller #2 (rev 05)
    00:1b.0 Audio device: Intel Corporation 6 Series/C200 Series Chipset Family High Definition Audio Controller (rev 05)
    00:1c.0 PCI bridge: Intel Corporation 6 Series/C200 Series Chipset Family PCI Express Root Port 1 (rev b5)
    00:1c.3 PCI bridge: Intel Corporation 6 Series/C200 Series Chipset Family PCI Express Root Port 4 (rev b5)
    00:1d.0 USB Controller: Intel Corporation 6 Series/C200 Series Chipset Family USB Enhanced Host Controller #1 (rev 05)
    00:1f.0 ISA bridge: Intel Corporation H67 Express Chipset Family LPC Controller (rev 05)
    00:1f.2 SATA controller: Intel Corporation 6 Series/C200 Series Chipset Family 6 port SATA AHCI Controller (rev 05)
    00:1f.3 SMBus: Intel Corporation 6 Series/C200 Series Chipset Family SMBus Controller (rev 05)
    02:00.0 Ethernet controller: Broadcom Corporation NetLink BCM57788 Gigabit Ethernet PCIe (rev 01)

    Is there any other information I need to post, and how do I do that?

    Read the article

  • Ubuntu, Bumblebee and Intel problem

    - by LnxSlck
    I have a brain teaser that I can't solve. When 12.04 (64-bit) came out, I did a fresh install and then installed Bumblebee with:

    sudo add-apt-repository ppa:bumblebee/stable
    sudo apt-get install bumblebee bumblebee-nvidia

    And Ubuntu recognized (System Settings - Details) the Intel card as the primary card, and I could launch applications with optirun. I have:

    00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)
    01:00.0 VGA compatible controller: NVIDIA Corporation GF119 [GeForce GT 520MX] (rev ff)

    Now, a few days back, I did the exact same thing: installed from the same DVD, everything the same, installed Bumblebee the same way. NVIDIA with optirun works fine, but Ubuntu doesn't have 3D effects:

    root@deathstar:~# optirun /usr/lib/nux/unity_support_test -p
    OpenGL vendor string: NVIDIA Corporation
    OpenGL renderer string: GeForce GT 520MX/PCIe/SSE2
    OpenGL version string: 4.2.0 NVIDIA 295.40
    Not software rendered: yes
    Not blacklisted: yes
    GLX fbconfig: yes
    GLX texture from pixmap: no
    GL npot or rect textures: yes
    GL vertex program: yes
    GL fragment program: yes
    GL vertex buffer object: yes
    GL framebuffer object: yes
    GL version is 1.4+: yes
    Unity 3D supported: no

    I never did anything to install Intel drivers (except installing mesa-utils); before, Bumblebee got everything working, but now I can't get Ubuntu to run Unity with 3D. Can someone please help me get Unity 3D working? Your help will be much appreciated.

    Read the article

  • Two folders in /sys/class/backlight?

    - by zebrapie
    ISSUE: Backlight brightness does not change.
    More detail: brightness will not change, using either 'System Settings - Screen' or the Fn keys (the brightness bar shows and moves, but screen brightness does not change). I noticed a post in this thread (http://ubuntuforums.org/showthread.php?t=1866283) about having multiple folders in /sys/class/backlight... I HAVE TWO FOLDERS TOO: 'intel_backlight' and 'acpi_video0'. Using the function keys alters the value in acpi_video0's 'brightness' file - but doesn't actually alter the brightness of the screen. If I add 'backlight=vendor' in Grub, my function keys then edit the value in the 'intel_backlight' brightness file - but again this doesn't actually change the brightness of the screen.
    Computer: Fujitsu Siemens Pi2515, Intel integrated graphics, no HDD partition.
    Already tried:
    - Editing grub to contain: acpi_osi=Linux acpi_backlight=vendor
    - http://ubuntuguide.net/change-screen-brightness-with-fn-key-in-ubuntu-11-0410-10
    - sudo apt-get install acpi
    - $ sudo setpci -s 00:02.0 F4.B=20
    - Brightness does not adjust in fallback mode either.
    - Reinstalling the OS, using Linux Mint (same problem).
    - Upgrading and downgrading the BIOS.
    Many thanks for reading. I understand this problem may need a bit of a Linux pro to sort. If anyone's up for the challenge I'll spend any amount of time being walked through this, posting results. Don't want to give up here!

    Read the article

  • pitfalls/disadvantages of functional programming

    - by CrazyJugglerDrummer
    When would you NOT want to use functional programming? What is it not so good at? I am looking more for disadvantages of the paradigm as a whole, not things like "not widely used" or "no good debugger available". Those answers may be correct as of now, but they deal with FP being a new concept (an unavoidable issue) and not any inherent qualities.
    Related: pitfalls of object-oriented programming; advantages of functional programming

    Read the article

  • Evolution of mainstream programming languages: simplicity versus complexity.

    - by Giorgio
    I had posted this question on http://stackoverflow.com but it was suggested that it may be more appropriate to post it on this forum. I did a quick search on this site and it seems to me that this question has not been asked yet. Please give me a hint if the topic has been raised already by someone else.
    Update: I have rephrased this question, removed personal opinions and made it shorter. I hope in this way it is better suited for this forum.
    Looking at the recent development of Java (Java 7) and C++ (C++0x), I see that new features are being added to these languages. For sure this makes it easier to use certain programming idioms, adding to the productivity of developers. On the other hand, there might be the following risks:
    - The language becomes too big, complex, and difficult to understand.
    - It lacks coherence in the design, e.g. if it mixes different paradigms like object orientation and functional programming, which might not fit well together.
    Questions: what is more important to you as a developer: to have a rich language that captures a large collection of programming idioms, or to have a small language that aims at coherence and simplicity (of course, with a good deal of libraries and tools accompanying it)? Or is it possible to have both? With respect to these issues: how do you judge the current evolution of mainstream programming languages like Java or C++? Are they becoming too complex, less intuitive? Do they have enough features? Do they need more? Are they still easy enough to understand and use?

    Read the article

  • Are two lines of push/pop code for each pre-draw state too many?

    - by Griffin
    I'm trying to simplify vector graphics management in XNA, currently by incorporating state preservation. 2X lines of push/pop code for X states feels like too many, and it just feels wrong to have two lines of code that look identical except for one being push() and the other being pop(). The goal is to eradicate this repetitiveness, and I hoped to do so by creating an interface through which a client can pass class/struct refs that he wants restored after the rendering calls. Also note that many beginner programmers will be using this, so forcing lambda expressions or other advanced C# features to be used in client code is not a good idea. I attempted to accomplish my goal by using Daniel Earwicker's Ptr class:

    public class Ptr<T>
    {
        Func<T> getter;
        Action<T> setter;

        public Ptr(Func<T> g, Action<T> s)
        {
            getter = g;
            setter = s;
        }

        public T Deref
        {
            get { return getter(); }
            set { setter(value); }
        }
    }

    an extension method:

    // doesn't work for structs since this is just syntactic sugar
    public static Ptr<T> GetPtr<T>(this T obj)
    {
        return new Ptr<T>(() => obj, v => obj = v);
    }

    and a Push function:

    // returns a Pop Action for later calling
    public static Action Push<T>(ref T structure) where T : struct
    {
        T pushedValue = structure; // copies the struct data
        Ptr<T> p = structure.GetPtr();
        return new Action(() => { p.Deref = pushedValue; });
    }

    However this doesn't work as stated in the code. How might I accomplish my goal? Example of code to be refactored:

    protected override void RenderLocally(GraphicsDevice device)
    {
        if (!(bool)isCompiled) { Compile(); }

        // TODO: make sure state settings don't implicitly delete any buffers/resources
        RasterizerState oldRasterState = device.RasterizerState;
        DepthFormat oldFormat = device.PresentationParameters.DepthStencilFormat;
        DepthStencilState oldBufferState = device.DepthStencilState;
        {
            // Rendering code
        }
        device.RasterizerState = oldRasterState;
        device.DepthStencilState = oldBufferState;
        device.PresentationParameters.DepthStencilFormat = oldFormat;
    }

    Read the article
