Search Results

Search found 8172 results on 327 pages for 'vector graphics'.


  • Trouble with Graphics Settings

    - by user291775
    I recently tried to install Ubuntu 14.04 LTS alongside Windows XP Pro on my 2005 Dell Dimension E510. Everything appeared to work correctly until I tried to log in, at which point the system froze on a blank coloured background that flickered to black every other time I hit a key or clicked the mouse. I then tried booting in graphical safe mode, at which point it told me it could not configure the graphics settings. Does anybody know what's going on? Thank you for any suggestions.


  • Adding 2D vector movement with rotation applied

    - by Michael Zehnich
    I am trying to apply a slight sine-wave movement to objects that float around the screen to make them a little more interesting. I would like the objects to oscillate from side to side, not front to back (so the oscillation does not affect their forward velocity). After reading various threads and tutorials, I have come to the conclusion that I need to create and add vectors, but I simply cannot come up with a solution that works. This is where I'm at right now, in the object's update method (updated based on comments):

        Vector2 oldPosition = new Vector2(spritePos.X, spritePos.Y);
        // note: newPosition is initially set in the constructor to spritePos.X/Y
        Vector2 direction = newPosition - oldPosition;
        Vector2 perpendicular = new Vector2(direction.Y, -direction.X);
        perpendicular.Normalize();
        sinePosAng += 0.1f;
        perpendicular.X += 2.5f * (float)Math.Sin(sinePosAng);
        spritePos.X += velocity * (float)Math.Cos(radians);
        spritePos.Y += velocity * (float)Math.Sin(radians);
        spritePos += perpendicular;
        newPosition = spritePos;
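    A language-neutral way to untangle this (a minimal sketch in C++ rather than XNA; Vec2, Floater, heading, amplitude and phase are illustrative names, not from the question) is to keep the true path and the visual wobble separate: advance the centre along the heading only, then draw at centre + perpendicular * amplitude * sin(phase). Because the lateral offset is recomputed from scratch each frame instead of being accumulated into the position, it can never feed back into the forward motion:

        #include <cmath>

        struct Vec2 { float x, y; };

        struct Floater {
            Vec2  centre;     // true position, advanced only along the heading
            float heading;    // direction of travel, in radians
            float speed;      // units per update
            float amplitude;  // lateral swing, in units
            float phase;      // advances each update and drives the sine

            void update() {
                centre.x += speed * std::cos(heading);
                centre.y += speed * std::sin(heading);
                phase    += 0.1f;
            }

            // Draw position: the centre plus a perpendicular sine offset.
            Vec2 renderPos() const {
                Vec2 perp = { -std::sin(heading), std::cos(heading) }; // unit normal
                float lateral = amplitude * std::sin(phase);
                return { centre.x + perp.x * lateral,
                         centre.y + perp.y * lateral };
            }
        };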


  • Rotate a vector

    - by marc wellman
    I want my first-person camera to smoothly change its viewing direction from direction d1 to direction d2, where the latter direction is indicated by a target position t2. So far I have implemented a rotation that works fine, but the speed of the rotation slows down the closer the current direction gets to the desired one. This is what I want to avoid. Here are the two very simple methods I have written so far:

        // this method initiates the direction change and sets the parameters
        public void LookAt(Vector3 target)
        {
            _desiredDirection = target - _cameraPosition;
            _desiredDirection.Normalize();
            _rotation = new Matrix();
            _rotationAxis = Vector3.Cross(Direction, _desiredDirection);
            _isLooking = true;
        }

        // this method gets executed by the Update() method if the _isLooking flag is up
        private void _lookingAt()
        {
            float dist = Vector3.Distance(Direction, _desiredDirection);
            // check whether the current direction has reached the desired one
            if (dist >= 0.00001f)
            {
                _rotationAxis = Vector3.Cross(Direction, _desiredDirection);
                _rotation = Matrix.CreateFromAxisAngle(_rotationAxis, MathHelper.ToRadians(1));
                Direction = Vector3.TransformNormal(Direction, _rotation);
            }
            else
            {
                _onDirectionReached();
                _isLooking = false;
            }
        }

    Again, the rotation works fine; the camera reaches its desired direction. But the speed is not constant over the course of the movement - it slows down. How do I achieve a rotation with constant speed?
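    For what it's worth, the slowdown pattern matches the unnormalized rotation axis: the length of Vector3.Cross(Direction, _desiredDirection) is the sine of the angle between the two directions, so the effective step shrinks as they converge. A constant-speed alternative, sketched in C++ with Rodrigues' rotation formula rather than the question's XNA types (Vec3 and rotateTowards are made-up names), rotates by a fixed angular step around the normalized axis and clamps the final step so it never overshoots:

        #include <algorithm>
        #include <cmath>

        struct Vec3 { float x, y, z; };

        static Vec3  cross(Vec3 a, Vec3 b) { return { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x }; }
        static float dot(Vec3 a, Vec3 b)   { return a.x*b.x + a.y*b.y + a.z*b.z; }
        static Vec3  scale(Vec3 v, float s){ return { v.x*s, v.y*s, v.z*s }; }
        static Vec3  add(Vec3 a, Vec3 b)   { return { a.x+b.x, a.y+b.y, a.z+b.z }; }
        static Vec3  normalize(Vec3 v)     { float l = std::sqrt(dot(v, v)); return scale(v, 1.0f / l); }

        // Rotate unit vector dir towards unit vector target by at most maxStep
        // radians per call; the step stays constant until the final, clamped one.
        // (An antiparallel dir/target pair needs a fallback axis, omitted here.)
        Vec3 rotateTowards(Vec3 dir, Vec3 target, float maxStep) {
            float cosA  = std::max(-1.0f, std::min(1.0f, dot(dir, target)));
            float angle = std::acos(cosA);          // remaining angle
            if (angle < 1e-5f) return target;       // close enough: snap
            float step = std::min(angle, maxStep);  // never overshoot
            Vec3 axis = normalize(cross(dir, target));
            // Rodrigues: v' = v cos(t) + (k x v) sin(t) + k (k.v)(1 - cos(t))
            Vec3 t1 = scale(dir, std::cos(step));
            Vec3 t2 = scale(cross(axis, dir), std::sin(step));
            Vec3 t3 = scale(axis, dot(axis, dir) * (1.0f - std::cos(step)));
            return add(add(t1, t2), t3);
        }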


  • Vector transform equation explanation

    - by cyberdemon
    I'm trying to understand the maths of moving points in 3D space by making a game written in C#. I'm looking at this Wolfire blog series which explains some basic 3D maths. I've read the first two parts but am stuck on the third. I know it's all really rudimentary stuff, but I find Googling for help with equations really hard. The one I'm struggling with is:

        0*(0.66, 0.75) + 2*(-0.75, 0.66) = (-1.5, 1.3)

    How can anything multiplied by 0 not be 0? So my question is, how does this look in code:

        x(a,b) + y(c,d)

    I know it's basic stuff but I just can't see it.
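    Read as scalar-times-vector, the first term really is zero: 0*(0.66, 0.75) = (0, 0), and the whole result comes from the second term, 2*(-0.75, 0.66) = (-1.5, 1.32), which the blog rounds to (-1.5, 1.3). A minimal sketch of that linear combination in C++ (the Vec2 type and the combine name are illustrative):

        #include <cstdio>

        struct Vec2 { double x, y; };

        // x*(a,b) + y*(c,d): scale each vector by its scalar, component-wise,
        // then add the results component-wise.
        Vec2 combine(double x, Vec2 ab, double y, Vec2 cd) {
            return { x * ab.x + y * cd.x,
                     x * ab.y + y * cd.y };
        }

        int main() {
            Vec2 r = combine(0.0, {0.66, 0.75}, 2.0, {-0.75, 0.66});
            std::printf("(%.2f, %.2f)\n", r.x, r.y);  // prints (-1.50, 1.32)
            return 0;
        }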


  • Intel graphics driver installer, now the CPU fan is rarely quiet

    - by Space monkey
    I have an Optimus chipset: Intel HD 4000 (i7-3635QM CPU) and a GeForce 640M. I don't care about the NVIDIA card, so I didn't try to install any proprietary drivers for it. I was having a choppy, high-CPU experience with gnome-shell on Ubuntu 14.04; it only happened when I moved a window around quickly. I used the Intel graphics installer hoping it would fix the problem, and it did: there is no choppiness or high CPU when I move windows around now. However, there is a new problem: the fan is rarely quiet, and doing barely anything at all will cause it to go into loud mode quickly. That happens despite CPU usage being at just around 4%. This wasn't the case before installing the Intel drivers; it would normally only do that if, for example, I was installing packages or doing something else that puts some stress on the CPU. I set all CPU cores to "powersave" using cpufreq-set, but nothing changed. Also, on Windows the fans are really quiet when I'm in powersave mode; I believe they completely shut off most of the time. I remember the installer giving me a report at the end of which packages it installed. Unfortunately, I didn't save the report and I don't know where it would have been saved. Any ideas or similar experiences?


  • How do I get Intel 845G graphics working?

    - by Rayson Jimenez
    I need help enabling the Intel 845G to run with the Intel drivers. I've looked everywhere on the net with no joy. I can't get the Intel video driver to load the GUI; it drops me straight to the shell prompt, and only the vesa driver works. Xorg.0.log shows the following. Any help would be GREATLY appreciated.

        [ 244.843] (II) LoadModule: "intel"
        [ 244.843] (II) Loading /usr/lib/xorg/modules/drivers/intel_drv.so
        [ 244.844] (II) Module intel: vendor="X.Org Foundation"
        [ 244.844]     compiled for 1.9.0, module version = 2.12.0
        [ 244.844]     Module class: X.Org Video Driver
        [ 244.844]     ABI class: X.Org Video Driver, version 8.0
        [ 244.844] (II) intel: Driver for Intel Integrated Graphics Chipsets: i810, i810-dc100, i810e, i815, i830M, 845G, 852GM/855GM, 865G, 915G, E7221 (i915), 915GM, 945G, 945GM, 945GME, Pineview GM, Pineview G, 965G, G35, 965Q, 946GZ, 965GM, 965GME/GLE, G33, Q35, Q33, GM45, 4 Series, G45/G43, Q45/Q43, G41, B43, B43, Clarkdale, Arrandale, Sandybridge, Sandybridge, Sandybridge, Sandybridge, Sandybridge, Sandybridge, Sandybridge
        [ 244.844] (--) using VT number 8
        [ 244.971] (EE) intel(0): No kernel modesetting driver detected.
        [ 244.971] (II) UnloadModule: "intel"
        [ 244.971] (EE) Screen(s) found, but none have a usable configuration.
        [ 244.971] Fatal server error:
        [ 244.971] no screens found
        [ 244.971] Please consult the The X.Org Foundation support at http://wiki.x.org for help.
        [ 244.971] Please also check the log file at "/var/log/Xorg.0.log" for additional information.
        [ 244.971]
        [ 245.213] ddxSigGiveUp: Closing log


  • Newly installed Ubuntu 12.10 and weird graphics

    - by Benji Marshall
    My machine: 2 GB RAM, Intel Pentium dual-core E2180 @ 2 GHz, NVIDIA GeForce 6200 LE. My friend had recommended Ubuntu to me and I thought I might as well get used to Linux in anticipation of my Raspberry Pi. He said that Wubi was the easiest way to install, so I installed it using Wubi. My first ever boot of Ubuntu from the Windows bootloader started normally, I logged on in a normal fashion, and my desktop loaded normally. I then pressed the Windows key/Power key and everything went wrong. Random lines of yellow and blue appeared on my screen and changed location when I moved my mouse. The lines stayed for a few seconds and then partially went away, so I could sort of use my computer. When I moved my mouse the entire desktop looked like it broke apart, with fragments of it scattered across my screen at random angles. I could move my mouse and the pointer would move, but clicking did nothing. I had to turn off my machine by pulling the plug. I would love to get off Windows, but at least it doesn't completely mess up the graphics and is relatively usable. Please help me solve this.


  • Graphics hardware warning when updating to 14.04

    - by pacomet
    As I use Ubuntu at work I only update to LTS versions, but now I'm not sure whether I can or should. My computer is ten years old; I would replace it if it were mine, but as it is owned by my employer I have to work with it. It's not a bad machine, and it runs fine (this was not true when it still had Windows on it ;-). When updating to 14.04 it warns about possible bad/slow performance with Unity 3D, so I stopped the update, as this is my work computer, not my own. As I understand from http://askubuntu.com/a/438958/25305, the Nvidia GeForce FX 5500 graphics card is still supported in 14.04. Now, on 12.04, I have driver version 173 and Unity 2D runs fine for me. Output of /usr/lib/nux/unity_support_test -p:

        OpenGL vendor string: NVIDIA Corporation
        OpenGL renderer string: GeForce FX 5500/AGP/SSE2
        OpenGL version string: 2.1.2 NVIDIA 173.14.39

        Not software rendered: yes
        Not blacklisted: no
        GLX fbconfig: yes
        GLX texture from pixmap: yes
        GL npot or rect textures: yes
        GL vertex program: yes
        GL fragment program: yes
        GL vertex buffer object: yes
        GL framebuffer object: yes
        GL version is 1.4+: yes

        Unity 3D supported: no

    Should I update? Is it better to stay with 12.04? Thanks


  • std::vector elements initializing

    - by Chameleon
        std::vector<int> v1(1000);
        std::vector<std::vector<int>> v2(1000);
        std::vector<std::vector<int>::const_iterator> v3(1000);

    How are the elements of these 3 vectors initialized? For int I tested it, and I saw that all elements become 0. Is this standard? I believed that primitives remain undefined. I created a vector with 300000000 elements, gave them non-zero values, deleted it and recreated it, to rule out the OS clearing the memory for data safety; the elements of the recreated vector were 0 too. What about the iterator: is there an initial value (0) from the default constructor, or does it remain undefined? When I check this, the iterators point to 0, but this could be the OS. When I created a special object to track constructors, I saw that for the first object the vector ran the default constructor, and for all the others it ran the copy constructor. Is this standard? Is there a way to completely avoid initialization of the elements, or must I create my own vector? (Oh my God, I always say NOT ANOTHER VECTOR IMPLEMENTATION.) I ask because I use ultra-huge sparse matrices with parallel processing, so I cannot use push_back(), and of course I don't want useless initialization when I will change the values later.
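    For reference, the zeroing observed above is required behaviour, not the OS: std::vector<T> v(n) value-initializes its n elements, which for int means 0 (a value-initialized iterator is likewise zeroed or default-constructed). In C++03 the container copy-constructs every element from one value-initialized temporary, which matches the constructor-tracking observation of one default construction followed by copies. A small sketch of the standard-sanctioned way to avoid paying for elements up front (reserve() plus push_back(), which admittedly does not suit parallel writes, as noted above):

        #include <cassert>
        #include <vector>

        int main() {
            // vector<int> v(n): every element is value-initialized to 0
            // by the standard, independent of what the OS hands back.
            std::vector<int> v(1000);
            assert(v[0] == 0 && v[999] == 0);

            // reserve() allocates capacity without constructing elements;
            // nothing is initialized until it is actually pushed.
            std::vector<int> w;
            w.reserve(1000);
            w.push_back(42);  // constructs exactly one element
            assert(w.size() == 1 && w.capacity() >= 1000);
            return 0;
        }

    For parallel writes into untouched memory, the usual escape hatch is a custom allocator whose construct() does nothing on value-initialization, which keeps std::vector without writing a whole new container.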


  • Cast vector<T> to vector<const T>

    - by user345386
    I have a member variable of type vector<T> (where T is a custom class, but it could be int as well). I have a function from which I want to return a pointer to this vector, but I don't want the caller to be able to change the vector or its items, so I want the return type to be const vector<const T>*. None of the casting methods I tried worked; the compiler keeps complaining that T is not compatible with const T. Here's some code that demonstrates the gist of what I'm trying to do:

        vector<int> a;
        const vector<const int>* b = (const vector<const int>*) (&a);

    This code doesn't compile for me. Thanks in advance!
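    A note on why the cast can't be made to work: vector<int> and vector<const int> are unrelated types (and vector<const int> is not a valid instantiation in practice, since its elements wouldn't be assignable). Const-qualifying the vector itself already protects the elements, because a const vector only exposes the const overloads of operator[], at(), begin(), and so on. A minimal sketch (Holder and items are illustrative names):

        #include <vector>

        class Holder {
        public:
            // Pointer-to-const is enough: through it, only const member
            // functions are callable, and they expose elements as const int&.
            const std::vector<int>* items() const { return &items_; }

        private:
            std::vector<int> items_{1, 2, 3};
        };

        int main() {
            Holder h;
            const std::vector<int>* v = h.items();
            int x = (*v)[0];      // reading through const is fine
            // (*v)[0] = 5;       // error: operator[] on a const vector is const
            // v->push_back(4);   // error: push_back is non-const
            (void)x;
            return 0;
        }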


  • C++ vector reference parameter

    - by Archanimus
    Hello folks, let's say we have a class:

        class MyClass {
            vector<vector<int> > myMatrice;
        public:
            MyClass(vector<vector<int> > &);
        };

        MyClass::MyClass(vector<vector<int> > & m) {
            myMatrice = m;
        }

    During the instantiation of MyClass, I pass a big vector<vector<int> > and I find that the object is actually copied and not only the reference, so it takes double the memory... Please, can anyone help me out with this problem? I've been stuck on it for too long. And thanks a lot!
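    That copy is expected: myMatrice is a value member, so the assignment myMatrice = m copies every element even though the parameter is a reference. Two common ways out, sketched below (Matrix and the class names are illustrative; option 1 needs C++11's std::move, while a reference member was the classic pre-C++11 answer and requires the caller's matrix to outlive the object):

        #include <utility>
        #include <vector>

        using Matrix = std::vector<std::vector<int>>;

        // Option 1: take by value and move in; the caller writes
        // MyClass c(std::move(bigMatrix)); and no element is copied.
        class MyClass {
        public:
            explicit MyClass(Matrix m) : myMatrice(std::move(m)) {}
        private:
            Matrix myMatrice;
        };

        // Option 2: store a reference; zero copies, but the referenced
        // matrix must stay alive as long as this object does.
        class MyClassRef {
        public:
            explicit MyClassRef(Matrix& m) : myMatrice(m) {}
        private:
            Matrix& myMatrice;
        };

        int main() {
            Matrix big(1000, std::vector<int>(1000, 0));
            MyClass owner(std::move(big));  // buffer ownership moves, no copy
            Matrix shared(10, std::vector<int>(10, 0));
            MyClassRef viewer(shared);      // an alias; shared must outlive viewer
            return 0;
        }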


  • Would vector of vectors be contiguous?

    - by user1150989
    I need to allocate a vector of rows, where each row is itself a vector. I know that a single vector's storage is contiguous; I wanted to know whether a vector of vectors is also contiguous. Example code is given below:

        vector<long> firstRow;
        firstRow.push_back(0);
        firstRow.push_back(1);

        vector<long> secondRow;
        secondRow.push_back(0);
        secondRow.push_back(1);

        vector< vector<long> > data;
        data.push_back(firstRow);
        data.push_back(secondRow);

    Would the sequence in memory be 0 1 0 1?
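    One way to see the layout empirically (a sketch that just prints addresses): the outer vector stores its two vector<long> objects contiguously, but each inner vector owns a separately allocated element buffer, so the longs are generally not adjacent and 0 1 0 1 is not guaranteed.

        #include <cstdio>
        #include <vector>

        int main() {
            std::vector<std::vector<long>> data{ {0, 1}, {0, 1} };

            // The two inner vector objects sit side by side in the outer buffer...
            std::printf("row objects: %p %p\n",
                        static_cast<const void*>(&data[0]),
                        static_cast<const void*>(&data[1]));

            // ...but their elements live in separate heap allocations, so
            // row 0's last long need not touch row 1's first.
            std::printf("row data:    %p %p\n",
                        static_cast<const void*>(data[0].data()),
                        static_cast<const void*>(data[1].data()));
            return 0;
        }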


  • Populate array from vector

    - by Zag zag..
    Hi, I would like to populate a 2-dimensional array from a vector. I think the best way to explain myself is to give some examples (with an array of [3, 5] length).

    When the vector is [1, 0]:

        [ [4, 3, 2, 1, 0],
          [4, 3, 2, 1, 0],
          [4, 3, 2, 1, 0] ]

    When the vector is [-1, 0]:

        [ [0, 1, 2, 3, 4],
          [0, 1, 2, 3, 4],
          [0, 1, 2, 3, 4] ]

    When the vector is [-2, 0]:

        [ [0, 0, 1, 1, 2],
          [0, 0, 1, 1, 2],
          [0, 0, 1, 1, 2] ]

    When the vector is [1, 1]:

        [ [2, 2, 2, 1, 0],
          [1, 1, 1, 1, 0],
          [0, 0, 0, 0, 0] ]

    When the vector is [0, 1]:

        [ [2, 2, 2, 2, 2],
          [1, 1, 1, 1, 1],
          [0, 0, 0, 0, 0] ]

    Have you got any ideas, a good library, or a plan? Any comments are welcome. Thanks.

    Note: I consulted Ruby's Matrix and Vector classes, but I don't see any way to use them for this.

    Edit: In fact, each value is the number of cells (from the current cell to the last cell) according to the given vector. If we take the example where the vector is [-2, 0], with the value *1* (at array[2, 3]):

        array = [ [<0>, <0>, <1>, <1>, <2>],
                  [<0>, <0>, <1>, <1>, <2>],
                  [<0>, <0>, <1>, *1*, <2>] ]

    ...we could think of it like this: the vector [-2, 0] means that -2 is for the columns and 0 is for the rows. So if we are at array[2, 3], we can move 1 time to the left (left because -2 is negative) with length 2 (because -2.abs == 2). And we don't move up or down, because the row component is 0.
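    Reading the edit literally (a sketch in C++ rather than Ruby, since the Ruby Matrix/Vector classes above were a dead end; populate and its argument order are made-up): each cell holds how many whole vector steps fit inside the grid starting from that cell, i.e. the minimum over the two axes of cells-available divided by the step length along that axis. This reproduces all five examples above.

        #include <algorithm>
        #include <climits>
        #include <cstdio>
        #include <vector>

        // vc is the column step, vr the row step (the question's [vc, vr]).
        std::vector<std::vector<int>> populate(int rows, int cols, int vc, int vr) {
            auto steps = [](int pos, int size, int v) {
                if (v > 0) return (size - 1 - pos) / v;  // moving toward the last index
                if (v < 0) return pos / -v;              // moving toward index 0
                return INT_MAX;                          // this axis never limits
            };
            std::vector<std::vector<int>> a(rows, std::vector<int>(cols));
            for (int r = 0; r < rows; ++r)
                for (int c = 0; c < cols; ++c)
                    a[r][c] = std::min(steps(c, cols, vc), steps(r, rows, vr));
            return a;
        }

        int main() {
            // vector [-2, 0] on a 3x5 grid should print rows of: 0 0 1 1 2
            for (auto& row : populate(3, 5, -2, 0)) {
                for (int x : row) std::printf("%d ", x);
                std::printf("\n");
            }
            return 0;
        }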


  • Ubuntu 12.10 graphics does not work properly

    - by madox2
    Graphics on Ubuntu 12.10 does not work as well as it did on 12.04. After the upgrade I installed the driver for my Nvidia GTS 450 graphics card:

        sudo apt-add-repository ppa:ubuntu-x-swat/x-updates
        sudo apt-get update
        sudo apt-get install nvidia-current

    But sometimes I see slight lag in videos played in VLC, some desktop and window effects lag, sometimes I see an indescribable soup of pixels on my screen at the start of Ubuntu, and so on. I feel a difference between 12.04 and 12.10 in favour of the former. Does anyone know what's wrong, or what I am missing? Here is the output of lspci -k:

        00:00.0 Host bridge: Intel Corporation 2nd Generation Core Processor Family DRAM Controller (rev 09)
        00:01.0 PCI bridge: Intel Corporation Xeon E3-1200/2nd Generation Core Processor Family PCI Express Root Port (rev 09)
            Kernel driver in use: pcieport
            Kernel modules: shpchp
        00:16.0 Communication controller: Intel Corporation 6 Series/C200 Series Chipset Family MEI Controller #1 (rev 04)
            Subsystem: Giga-byte Technology Device 1c3a
            Kernel driver in use: mei
            Kernel modules: mei
        00:1a.0 USB controller: Intel Corporation 6 Series/C200 Series Chipset Family USB Enhanced Host Controller #2 (rev 05)
            Subsystem: Giga-byte Technology Device 5006
            Kernel driver in use: ehci_hcd
        00:1b.0 Audio device: Intel Corporation 6 Series/C200 Series Chipset Family High Definition Audio Controller (rev 05)
            Subsystem: Giga-byte Technology Device a000
            Kernel driver in use: snd_hda_intel
            Kernel modules: snd-hda-intel
        00:1c.0 PCI bridge: Intel Corporation 6 Series/C200 Series Chipset Family PCI Express Root Port 1 (rev b5)
            Kernel driver in use: pcieport
            Kernel modules: shpchp
        00:1c.4 PCI bridge: Intel Corporation 6 Series/C200 Series Chipset Family PCI Express Root Port 5 (rev b5)
            Kernel driver in use: pcieport
            Kernel modules: shpchp
        00:1d.0 USB controller: Intel Corporation 6 Series/C200 Series Chipset Family USB Enhanced Host Controller #1 (rev 05)
            Subsystem: Giga-byte Technology Device 5006
            Kernel driver in use: ehci_hcd
        00:1e.0 PCI bridge: Intel Corporation 82801 PCI Bridge (rev a5)
        00:1f.0 ISA bridge: Intel Corporation H61 Express Chipset Family LPC Controller (rev 05)
            Subsystem: Giga-byte Technology Device 5001
            Kernel driver in use: lpc_ich
            Kernel modules: lpc_ich
        00:1f.2 IDE interface: Intel Corporation 6 Series/C200 Series Chipset Family 4 port SATA IDE Controller (rev 05)
            Subsystem: Giga-byte Technology Device b002
            Kernel driver in use: ata_piix
        00:1f.3 SMBus: Intel Corporation 6 Series/C200 Series Chipset Family SMBus Controller (rev 05)
            Subsystem: Giga-byte Technology Device 5001
            Kernel modules: i2c-i801
        00:1f.5 IDE interface: Intel Corporation 6 Series/C200 Series Chipset Family 2 port SATA IDE Controller (rev 05)
            Subsystem: Giga-byte Technology Device b002
            Kernel driver in use: ata_piix
        01:00.0 VGA compatible controller: NVIDIA Corporation GF116 [GeForce GTS 450] (rev a1)
            Subsystem: CardExpert Technology Device 0401
            Kernel driver in use: nvidia
            Kernel modules: nvidia_current, nouveau, nvidiafb
        01:00.1 Audio device: NVIDIA Corporation GF116 High Definition Audio Controller (rev a1)
            Subsystem: CardExpert Technology Device 0401
            Kernel driver in use: snd_hda_intel
            Kernel modules: snd-hda-intel
        03:00.0 Ethernet controller: Atheros Communications Inc. AR8151 v2.0 Gigabit Ethernet (rev c0)
            Subsystem: Giga-byte Technology Device e000
            Kernel driver in use: atl1c
            Kernel modules: atl1c


  • Will Ubuntu 11.04 work on my old PC?

    - by M4tic
    It has an old SiS-based graphics chip with 64 MB of memory, for which there were never any 3D drivers. Ubuntu 10.10 doesn't even boot on it, nor on my laptop, a Lenovo N200 with Intel 9-something graphics and 128 MB, so I'm really worried. Broadband prices don't come cheap in South Africa, so the download has to be worth it. I can wait for the shipment, but I ordered 10.10 and it hasn't come, so I don't know if I'm barred from using that service, since I've been sent a disc every release.


  • How to enable desktop effects on Ubuntu 10.04 after upgrade from Ubuntu 8.04?

    - by Manohar Bhattarai
    I upgraded my Ubuntu 8.04 to Ubuntu 10.04. When I try to enable desktop effects it says "Desktop effects could not be enabled". The output of lspci | grep VGA is:

        00:02.0 VGA compatible controller: Intel Corporation 82845G/GL[Brookdale-G]/GE Chipset Integrated Graphics Device (rev 03)

    Hardware Drivers says there is no proprietary hardware driver. I installed the nVidia driver, but I think mine is an Intel graphics device. Please help.


  • Does upgrading RAM cause an increase in the graphics card's share?

    - by A.S.
    I asked this question on Ask Ubuntu, and the most-voted answers suggested upgrading my RAM. But a point about my graphics card came up: since I can upgrade the RAM but not the graphics card, does upgrading the RAM also cause the graphics memory to increase? To clarify, my specs are given below:

        Laptop: Lenovo 3000 Y410 (bought in October 2008)
        RAM: 1 GB (DDR2)
        External graphics (dedicated): N/A
        Internal graphics (shared): 256 MB
        Graphics chipset: Intel GMA X3100

    My question is: if I increase my RAM to 3 GB, will it increase the graphics card's share of the memory? In other words, if the graphics card shares 256 MB of 1 GB RAM, will it share more when I upgrade the RAM to 2 GB or more? An authoritative resource link will be much appreciated. I have recently learned that my chipset, the GMA X3100, can address 384 MB of RAM - hence the question.


  • Graphics card initialisation problems when booting - requires a "double" boot

    - by DMA57361
    Problem Outline

    When booting from cold (my machine is disconnected from mains power when off, but leaving it connected doesn't help), the graphics card (a single PCI-e GeForce 460) will not initialise on the first boot, leaving me with the motherboard's on-board graphics (which kick in automatically if no PCI-e card is found). However, if I restart the computer - normally I do this by powering it off just after the numlock lights up on the keyboard (i.e. just after POST/BIOS and before Windows takes over), waiting for the system to whirr down, and powering up again - the graphics card works correctly. Once double-booted in this manner the system seems to work correctly, with no noticeable problems. This is reproducible every time I boot; it has been working like this for about a month now.

    Background Information

    Sept 2010 - I suffered a hardware malfunction (crashes in Windows and graphics corruption on BIOS screens). By way of spare hardware I determined that replacing the PSU removed the issue, so I replaced the PSU with a brand new one of slightly higher power (460W replaced with 500W).
    Oct 2010 - The problem resurfaced. I purchased a new graphics card (GeForce 460), which removed the problem. The new graphics card immediately started having the boot initialisation problems mentioned. I presumed there had been a motherboard fault all along, but because the system worked once booted, and I was temporarily out of spare money, I left the system alone and continued to use it.
    Early/mid Dec 2010 - In the space of 5 days I received 3 instances of hard drive corruption (seemingly fixed by chkdsk and sfc in each case...). Since I was already under the impression the motherboard was faulty, I purchased a new one ASAP; this also required new RAM (as I dropped from 4 slots to 2 and didn't want to drop memory quantity).
    Past 3-4 weeks - With a brand new PSU, graphics card, motherboard and RAM I'm suffering the problem outlined above.

    So, what could be causing this and how can I resolve it?

    Additional Notes

    Once double-booted the system seems to work entirely correctly. The graphics card problem has occurred on two entirely different motherboards. I do not have the opportunity to test the graphics card in a different computer (I have only the old motherboard, which is dubious, or a really old desktop that still has an AGP port). Under load (i.e. modern games running long enough for temperatures to plateau) the system remains stable and performs as expected. The software that came with the new motherboard and SpeedFan both report all voltages and temperatures within nominal bounds, both when idle and under load. I've looked over the BIOS settings for my motherboard multiple times and can find nothing that helps. The system is configured to run with everything at standard levels - no overclocking. I've tried booting the system with only the mobo and graphics card connected (thinking maybe my new PSU was too weak for the new graphics card, even though it meets the quoted PSU requirements for the card), but the same problem persists (and really, if the PSU were weak I'd have problems with the system under load). When the graphics card does not initialise, the fan on its cooling unit is running, possibly slower than otherwise - but this measurement is by eye and so unreliable.


  • How to compare speed of graphics cards in Windows desktop environment?

    - by Al Kepp
    I use Windows 7 and an Intel Core i3 CPU with integrated graphics. My problem is that the integrated graphics eats valuable system RAM for the display. I can replace it with an old PCIe Radeon X700, so that all system RAM would be usable by applications. The question is whether an old Radeon X700 is comparable in Windows 7 desktop speed to the new integrated i3 graphics. Are there any test programs which compare the speed of graphics cards in the Windows 7 desktop environment (i.e. no Direct3D games, just the Windows desktop)? (According to Tom's Hardware, the Radeon X700 is probably even faster than the Core i3 in 3D. But there are no native WDDM 1.1 Windows 7 drivers for the X700; only WDDM 1.0 Vista drivers are available.)


  • How can I move a polygon edge 1 unit away from the center?

    - by Stephen
    Let's say I have a polygon class that is represented by a list of vector classes as vertices, like so:

        var Vector = function(x, y) {
            this.x = x;
            this.y = y;
        },
        Polygon = function(vectors) {
            this.vertices = vectors;
        };

    Now I make a polygon (in this case, a square) like so:

        var poly = new Polygon([
            new Vector(2, 2),
            new Vector(5, 2),
            new Vector(5, 5),
            new Vector(2, 5)
        ]);

    So, the top edge would be [poly.vertices[0], poly.vertices[1]]. I need to stretch this polygon by moving each edge away from the center of the polygon by one unit, along that edge's normal. The following example shows the first edge, the top, moved one unit up. The final polygon should look like this new one:

        var finalPoly = new Polygon([
            new Vector(1, 1),
            new Vector(6, 1),
            new Vector(6, 6),
            new Vector(1, 6)
        ]);

    It is important that I iterate, moving one edge at a time, because I will be doing some collision tests after moving each edge. Here is what I tried so far (simplified for clarity), which fails triumphantly:

        for (var i = 0; i < vertices.length; i++) {
            var a = vertices[i],
                b = vertices[i + 1] || vertices[0]; // in case of final vertex
            var ax = a.x, ay = a.y, bx = b.x, by = b.y;

            // get some new perpendicular vectors
            var a2 = new Vector(-ay, ax),
                b2 = new Vector(-by, bx);

            // make into unit vectors
            a2.convertToUnitVector();
            b2.convertToUnitVector();

            // add the new vectors to the original ones
            a.add(a2);
            b.add(b2);

            // the rest of the code, collision tests, etc.
        }

    This makes my polygon start slowly rotating and sliding to the left, instead of what I need. Finally, the example shows a square, but the polygons in question could be anything. They will always be convex, and always with vertices in clockwise order.
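    The visible bug is that the perpendiculars are taken of the position vectors a and b rather than of the edge direction b - a, so each vertex is pushed sideways relative to the origin, which is why the polygon rotates and slides. A sketch of the per-edge move (in C++ for illustration; Vec2 and inflate are made-up names): take the edge direction from the unmoved shape, turn it into the outward unit normal (d.y, -d.x) - outward for clockwise vertices in y-down screen coordinates - and shift both endpoints of that edge by it. Run on the example square with dist = 1, this yields exactly finalPoly.

        #include <cmath>
        #include <cstddef>
        #include <vector>

        struct Vec2 { double x, y; };

        // Offset every edge of a convex, clockwise polygon outward by dist,
        // one edge at a time (collision tests could run after each step).
        void inflate(std::vector<Vec2>& v, double dist) {
            const std::size_t n = v.size();
            const std::vector<Vec2> orig = v;  // edge directions from the unmoved shape
            for (std::size_t i = 0; i < n; ++i) {
                const Vec2& a = orig[i];
                const Vec2& b = orig[(i + 1) % n];    // wraps for the final edge
                Vec2 d = { b.x - a.x, b.y - a.y };    // edge direction, not positions
                double len = std::sqrt(d.x * d.x + d.y * d.y);
                Vec2 nrm = { d.y / len, -d.x / len }; // outward unit normal
                v[i].x += nrm.x * dist;
                v[i].y += nrm.y * dist;
                v[(i + 1) % n].x += nrm.x * dist;
                v[(i + 1) % n].y += nrm.y * dist;
                // ...collision tests for this edge would go here...
            }
        }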


  • Blending Three Images into a Graphics Context Using Blend Mode kCGBlendModeOverlay

    - by steganous
    Does kCGBlendModeOverlay not work exactly like Photoshop's Overlay blending mode? I'm trying to overlay three images into a graphics context via:

        [uiimageGreen drawAtPoint:CGPointMake(x, y) blendMode:kCGBlendModeOverlay alpha:1.0];
        [uiimageRed drawAtPoint:CGPointMake(x, y) blendMode:kCGBlendModeOverlay alpha:1.0];
        [uiimageBlue drawAtPoint:CGPointMake(x, y) blendMode:kCGBlendModeOverlay alpha:1.0];

    In the end, if I overlay just two of the three, the result is much closer to my desired output color in places where both images intersect. Adding the third image, however, causes the first-drawn image's color to be dominant in the resulting mix of colors (e.g. in the above code, green comes out dominant, when the result should actually be white). Do you get the same result if you try?


  • Touch draw in Quartz 2D/Core Graphics

    - by OgreSwamp
    Hello, I'm trying to implement a "hand draw tool". At the moment the algorithm looks like this (I won't paste any code because the methods are quite big, so I'll try to explain the idea).

    Drawing: in the touchesStarted: method I create an NSMutableArray *pointsArray, add the point to it, and call setNeedsDisplay:. In the touchesMoved: method I calculate the points between the last point added to pointsArray and the current point, add all of them to pointsArray, and call setNeedsDisplay:. In the touchesFinished: event I calculate the points between the last point added to the array and the current point, set a touchesWereFinished flag, and call setNeedsDisplay:.

    Render: the drawRect: method checks that pointsArray != nil and that there is data in it. If there is, it draws circles at each point of the array. If the touchesWereFinished flag is set, it saves the current context to a UIImage, releases pointsArray, sets it to nil and resets the flag.

    There are a lot of disadvantages to this method: it is slow; it becomes extremely slow when the user touches and moves the finger for a long time, and the array becomes enormous; and "lines" composed of circles are ugly.

    I would like to change my algorithm to make it a bit faster and the lines smoother. As a result I would like to have lines like in the picture at the following URL (sorry, not enough reputation to insert an image): http://2.bp.blogspot.com/_r5VzEAUYXJ4/SrOYp8tJCPI/AAAAAAAAAMw/ZwDKXiHlhV0/s320/SketchBook+Mobile(4).png

    Can you advise me how I can draw lines this way (smooth and slim at the edges)? I thought of drawing circles with an alpha gradient at the edges (to make the lines smoother), but that would be extremely slow IMHO. Thanks for the help


  • Disappeared graphics card

    - by lenovo user
    I have a Lenovo T520 with two graphics cards, an nVidia Quadro and an Intel integrated GPU, and I'm running an Ubuntu and Windows 7 dual boot. I can no longer find any trace of my Intel graphics card. In my Linux boot:

        > lspci | grep VGA
        01:00.0 VGA compatible controller: nVidia Corporation GF106 [Quadro 2000M] (rev a1)

    In Windows, in Control Panel > Display > Advanced settings, I only see the NVIDIA Quadro 2000M. In the BIOS there is no mention of the Intel graphics card, and nowhere can I find a way to turn it on or off. I thought I was going crazy, but then I found a post I made on Ask Ubuntu 3 months ago where I listed the output of lspci on this same machine:

        lspci | grep VGA
        00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)
        01:00.0 VGA compatible controller: nVidia Corporation GF106 [Quadro 2000M] (rev a1)

    What is going on? How could my Intel graphics card have been disabled or turned off without my knowledge? I've been into the BIOS 3 times now, each time convinced I must have missed something the last time, but I always find nothing. Am I missing something there? Could a thief have opened my computer and stolen my graphics card?


  • Looking for "bitmap-vector" image editor

    - by Borek
    I used to use PhotoImpact, which is no longer developed, so I'm looking for a replacement. What made PhotoImpact great for me was the ability to work in both bitmap and vector modes. What I mean by that: I could take an image or screenshot and easily add arrows, text captions or shapes to it. These shapes were vector objects, so I could come back to them later and amend their properties easily. Software I know of:

    Paint.NET is purely bitmap, so please don't recommend it - layers are not enough for my needs. The drawing tools in MS Office work pretty much the way I'd like - you can paste an image and then add vector objects on top of it - but it doesn't feel right to store full-fidelity original images as .docx or .pptx (I don't fully trust Word/PowerPoint not to compress the image). I'm not sure about GIMP, but if it's just a "better Paint.NET" (i.e., layers but no vector objects) I'm not interested. Photoshop is out of the question purely because of its price tag. Corel killed PhotoImpact because they already had a competing product (Paint Shop Pro), but AFAIK it lacks vector features.

    Any tips for PhotoImpact alternatives would be very welcome.


  • KD-Trees and missing values (vector comparison)

    - by labratmatt
    I have a system that stores vectors and allows a user to find the n most similar vectors to a query vector. That is, a user submits a vector and my system spits out the n most similar ones. I generate the similar vectors using a KD-tree and everything works well, but I want to do more: I want to present a list of the n most similar vectors even if the user doesn't submit a complete vector (i.e. a vector with missing values). That is, if a user submits a vector with three dimensions, I still want to find the n nearest vectors among the stored ones (which have 11 dimensions). I have a couple of obvious solutions, but I'm not sure either one is very good:

    1. Create multiple KD-trees, each built using the most popular subset of dimensions a user will search for. That is, if a user submits a query vector with the three dimensions x, y, z, I match that query against an already-built KD-tree which contains vectors of only those three dimensions.
    2. Ignore KD-trees when a user submits a query vector with missing values, and compare the query vector to the stored vectors (kept in a table in a DB) one by one, using something like a dot product.

    This has to be a common problem; any suggestions? Thanks for the help.
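    A sketch of option 2 (hedged: the names are illustrative, and Euclidean distance over the shared dimensions stands in for "something like a dot product"): score each stored 11-dimensional vector using only the dimensions the query actually provides, then keep the n best.

        #include <algorithm>
        #include <cmath>
        #include <cstddef>
        #include <optional>
        #include <vector>

        // A query holds a value per dimension, or nullopt where it is missing.
        using Query  = std::vector<std::optional<double>>;
        using Stored = std::vector<double>;

        // Distance restricted to the dimensions the query actually provides.
        double partialDistance(const Query& q, const Stored& s) {
            double sum = 0.0;
            for (std::size_t d = 0; d < q.size(); ++d)
                if (q[d]) {
                    double diff = *q[d] - s[d];
                    sum += diff * diff;
                }
            return std::sqrt(sum);
        }

        // Brute-force n-nearest over all stored vectors (option 2 above);
        // returns the indices of the n closest matches.
        std::vector<std::size_t> nearest(const Query& q,
                                         const std::vector<Stored>& db,
                                         std::size_t n) {
            std::vector<std::size_t> idx(db.size());
            for (std::size_t i = 0; i < idx.size(); ++i) idx[i] = i;
            n = std::min(n, idx.size());
            std::partial_sort(idx.begin(), idx.begin() + n, idx.end(),
                              [&](std::size_t a, std::size_t b) {
                                  return partialDistance(q, db[a]) <
                                         partialDistance(q, db[b]);
                              });
            idx.resize(n);
            return idx;
        }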

