Search Results

Search found 28559 results on 1143 pages for 'upgrade issue'.


  • Why CFOs Should Care About Big Data

    - by jmorourke
    The topic of "big data" has clearly reached a tipping point in 2012. After plenty of coverage in the IT press over the past few years, we are now starting to see "big data" covered in the mainstream business press, including a cover story in the October 2012 issue of the Harvard Business Review. To help customers understand both the challenges of managing big data and the opportunities that can be created by leveraging it, Oracle has recently run and published the results of a customer survey, as well as white papers and articles on this topic. Most recently, we commissioned a white paper titled "Mastering Big Data: CFO Strategies to Transform Insight into Opportunity".

    The premise here is that big data is not just a topic CIOs should pay attention to, but one that CFOs should understand and take advantage of as well. Clearly, whoever masters the art and science of big data will be positioned for competitive advantage in their industry or market. That's why smart CFOs are taking control of big data and business analytics projects, not just to uncover new ways to drive growth in a slowing global economy, but also to be a catalyst for change in the enterprise. With an increasing number of CFOs now responsible for overseeing IT investments and providing strategic insight to the board, CFOs will increasingly be called upon to take a leadership role in assessing the value of big data initiatives, building on their traditional skills in reporting and helping managers analyze data to support decision making.

    Here is the white paper referenced above, which is posted on the Oracle C-Central/CFO web site, along with some other resources that can help CFOs master the topic of big data:

    - White paper: "Mastering Big Data: CFO Strategies to Transform Insight into Opportunity"
    - CFO Market Watch article: "Does Big Data Affect the CFO?"
    - Oracle survey report: "From Overload to Impact – An Industry Scorecard on Big Data Industry Challenges"
    - Upcoming big data webcast with Andrew McAfee

    Here's a general link to Oracle C-Central/CFO in case you want to start there: www.oracle.com/c-central/cfo. Feel free to contact me if you have any questions or need additional information: [email protected]

    Read the article

  • Unity3D Android: Game Over/Retry

    - by user3666251
    I'm making a simple 2D game for Android using the Unity3D game engine. I created all the levels and everything, but I'm stuck at making the game over/retry menu. So far I've been using new scenes as a game over menu. I used this simple script:

        #pragma strict

        function OnCollisionEnter(collision : Collision) {
            if (collision.collider.tag == "Player") {
                Application.LoadLevel("GameOver");
            }
        }

    And this as a 'menu':

        #pragma strict

        var myGUISkin : GUISkin;
        var btnTexture : Texture;

        function OnGUI() {
            GUI.skin = myGUISkin;
            if (GUI.Button(Rect(Screen.width/2 - 60, Screen.height/2 + 30, 100, 40), "Retry"))
                Application.LoadLevel("Easy1");
            if (GUI.Button(Rect(Screen.width/2 - 90, Screen.height/2 + 100, 170, 40), "Main Menu"))
                Application.LoadLevel("MainMenu");
        }

    The problem is that I would have to create over 200 game over scenes and obstacles (the objects that kill the player), and recreate the same script over 200 times, once for each level. Is there any other way to make this faster and less painful? I've been searching the web but didn't find anything useful for my issue. Thank you.
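
    A common way around the scene explosion (a sketch only, not the asker's code; the script and scene names are illustrative) is to keep one shared "GameOver" scene and record which level to retry in a static variable before loading it:

        // GameOverTrigger.js - attach this one script to every obstacle.
        #pragma strict

        static var lastLevel : String;   // level to reload when the player retries

        function OnCollisionEnter(collision : Collision) {
            if (collision.collider.tag == "Player") {
                lastLevel = Application.loadedLevelName;  // remember the current level
                Application.LoadLevel("GameOver");        // one game-over scene for all levels
            }
        }

    The Retry button in the shared menu can then call Application.LoadLevel(GameOverTrigger.lastLevel) instead of a hard-coded scene name, so a single scene and a single script serve all 200 levels.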

    Read the article

  • Why is IIS 7.5 flushing file cache very often?

    - by Steffen
    We're running a Windows 2008 R2 server with IIS 7.5 for serving image files. It's only used for static content, and file caching has been set up to cache files for 10 minutes. However, IIS frequently flushes the entire cache (seen by using Perfmon). It's not application pool recycling, and it's not because the TTL has expired, so now I'm at a loss :-( I've included a screenshot of the Perfmon graph where you can clearly see the issue. Is there anywhere I can see WHY it's doing these flushes? (Note: I'm aware I could maybe detect it by attaching a debugger to the process, but that's not an option because it's a production server, and it cannot handle the slowdown a debugger would cause.)
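
    For anyone comparing notes, the flush counter can also be logged from a command prompt on the server; a small sketch, assuming the default English names of the Web Service Cache performance object:

        typeperf "\Web Service Cache\File Cache Flushes" "\Web Service Cache\File Cache Hits %" -si 5 -o flushes.csv

    Correlating the timestamps in the CSV with the IIS and System event logs can at least narrow down what else happens at the moment of each flush.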

    Read the article

  • How to achieve reliable Gigabit Ethernet Link with my Acer Aspire Revo R3610?

    - by The Operator
    I want to stream HD movies over my wired gigabit LAN from my PC to my Acer Aspire Revo R3610. It's connected with a 3 ft Cat5e patch cable to my Netgear GS605v2 switch. The PC acting as file server is connected at 1 Gbps to the switch. Network driver options are set to defaults, including automatic speed/duplex negotiation on both machines. The Revo will not connect to my network switch at 1 Gbps: the OS reports that it reverts to 100 Mbps either shortly after connection or immediately upon connection. Through a process of elimination (trying different drivers, patch cables, ports on the switch, and other 1 Gbps-capable devices connected to the switch, which successfully achieve 1 Gbps links and performance) I have drawn the conclusion that there is either a hardware or software (driver) issue with the Revo itself. I have performed tests using Windows 7 and Ubuntu 9.10. Can anyone offer insight on Gigabit Ethernet with the Revo?
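
    On the Ubuntu 9.10 test install, the negotiated link and the modes the NIC advertises can be inspected, and gigabit-only negotiation forced, with ethtool; a sketch, assuming the interface is eth0:

        # show negotiated speed and the advertised link modes
        sudo ethtool eth0
        # advertise only 1000baseT/Full (bitmask 0x020) and renegotiate
        sudo ethtool -s eth0 advertise 0x020

    If the link then fails to come up at all, rather than falling back to 100 Mbps, that points more strongly at the Revo's NIC hardware than at driver defaults.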

    Read the article

  • Most effective work habit for coding? [on hold]

    - by Cris
    Working on a big solo project (~15,000 LOC), I am encountering the following phenomenon: I seem to work best when I program in short bursts of 10-15 minutes. Right now I am working on a section which is a complete first for me architecturally, and when architectural issues emerge during implementation, I seem to serve them best by taking a total break, then later sketching out the ideas on paper, and only going back to code when I feel I have sufficient clarity. This iterates until the architectural issue for that section is resolved. It seems quite counterintuitive that I can progress more quickly by coding less and taking more breaks. I am nearing the end of the sections which are "firsts" for me, and about to dive into material with which I am much more familiar, and I am wondering whether this counterintuitive efficiency will continue. So my question is: even for regular coding of sections one is familiar with, which don't require constant re-clarification of the best architecture, is more progress to be attained by taking more breaks and coding in bursts?

    Read the article

  • Severe latency only on one machine and only when accessing intranet site

    - by Joe M.
    I have one desktop machine that is having consistently high latency only when trying to load a page from an intranet site. Using the Chrome Developer Tools, the site shows a "Waiting" time of 4-5 seconds on each page load. Other machines show <50 ms, and the problem machine loads regular internet sites with <1 s latency, so the problem is on one machine only, and only when accessing the intranet site. This is a small business and all the hosts are on 192.168.0.0/24. I would have suspected a connection issue with the problem machine, but normal internet sites are not showing latency. Then I would have looked at connection issues with the intranet web server, but other machines are not seeing latency to it. What else can I look at to troubleshoot this?
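
    A flat 4-5 second "Waiting" time often points at name resolution or proxy discovery rather than the network path itself. One way to break the delay into phases from the problem machine (a sketch, assuming curl is available and the site answers at http://intranet/; substitute the real hostname):

        curl -o /dev/null -s -w "dns: %{time_namelookup}s  connect: %{time_connect}s  first byte: %{time_starttransfer}s\n" http://intranet/

    If the DNS phase is near zero but the first byte is late, the delay is server-side for that one client; if DNS itself eats the 4-5 seconds, compare the machine's DNS settings against a working host.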

    Read the article

  • What can be done to decrease the number of live issues with applications?

    - by User Smith
    First off, I have seen this post, which is slightly similar to my question: What can you do to decrease the number of deployment bugs of a live website?

    Let me lay out the situation for you. The team of programmers I belong to has metrics associated with our code. Over the last several months the errors in our live system have increased by a large amount. We require that our updates to applications be tested by at least one other programmer prior to going live. I am personally against this as the only gate, because I think applications should also be tested by end users: end users are much better testers than programmers. I am not against programmers testing (obviously programmers need to test code), but most of the time they are too close to the code. The reason I think end users should test in our scenario is that we don't have business analysts; we just have programmers. I come from a background where BAs took care of all the testing once programmers signed off that it was ready to go live.

    We do have a staging environment in place that is a clone of the live environment, which we use to ensure that we don't have issues between the development and live environments, and this does catch some bugs. But we don't really do end user testing at all; nobody tests our code except programmers, which I think got us into this mess (ideally, we would have BAs, QA, or professional testers). We don't have a QA team or anything of that nature, and we don't have fully laid out test cases for our projects.

    I am just a peon programmer at the bottom of the rung, but I am probably more tired of these issues than the managers complaining about them, so I don't have the standing to tell them they are doing it all wrong. I have tried gentle pushes in the right direction. Any advice or suggestions on how to alleviate this issue?

    Read the article

  • How to build the mainline kernel source package?

    - by Maxime R.
    The Ubuntu kernel PPA only provides linux-headers*.deb and linux-image*.deb packages. How can I build the corresponding linux-source*.deb package?

    Context: I'm currently running Ubuntu 11.10 with the mainline kernel (3.2-rc6 at the moment) to get better support for my Sandy Bridge IGP (Dell E6420 laptop with an Intel i5-2520M CPU). I'd also like to install this touchpad driver, since ALPS touchpads are badly supported (see the bug report behind the previous link), while waiting for upstream support in kernel 3.3. The problem is, DKMS keeps complaining about not finding the full kernel source:

        Module build for the currently running kernel was skipped since the
        kernel source for this kernel does not seem to be installed.

    It appears I may not need the full source, but I'd still like to try having it installed to see if it solves my problem. What I tried:

    - Uncompressing the kernel.org source archive in /usr/src/. DKMS still complained.
    - Manually updating the kernel source package with uupdate and the mainline source package, as explained here. Did not succeed.
    - Manually building the linux-source package following @roadmr's and @elmicha's instructions. I eventually succeeded in building it, but DKMS still complained about the missing source. At last I noticed an error I had not caught in the first place while reinstalling the kernel headers. It appears the .deb I got may have been corrupted; downloading it again did the trick :) Alas, while DKMS then agreed to compile the module, I ran into the following error, which appears to have already been reported.

    That issue isn't solved yet, but I won't pursue it, because in the end I decided to test the 3.2-rc6 kernel through the xorg-edgers PPA, which appears to be correctly patched: it works. Nevertheless, it might still be of some interest to know how to build the mainline linux-source package, as the Ubuntu Kernel Team doesn't provide it. Not to mention that I learned a lot in the process ^^
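
    In case it helps someone else searching for this, the Ubuntu kernel packaging tree can build the linux-source package itself; a rough sketch, assuming the packaging branch matching the mainline build is checked out (target names can vary between releases):

        # from the top of the Ubuntu kernel packaging tree
        fakeroot debian/rules clean
        # binary-indep builds the architecture-independent packages,
        # which should include linux-source-*.deb
        fakeroot debian/rules binary-indep

    The resulting linux-source-*.deb installs a source tarball under /usr/src.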

    Read the article

  • Getting Unity 3D working on legacy Nvidia card

    - by user69545
    I installed the latest nVIDIA drivers for my FX5500 card. I understand that the X server version does not officially support this driver or card, but I was wondering what I can do to get Compiz running. I have researched this issue for hours but cannot come up with an answer myself. I might be doing all this for nothing, but I wanted to at least try. Here is the output of my test:

        mike@mike-linux-box:~$ /usr/lib/nux/unity_support_test -p
        OpenGL vendor string:   NVIDIA Corporation
        OpenGL renderer string: GeForce FX 5500/AGP/SSE2
        OpenGL version string:  2.1.2 NVIDIA 173.14.35

        Not software rendered:    yes
        Not blacklisted:          no
        GLX fbconfig:             yes
        GLX texture from pixmap:  yes
        GL npot or rect textures: yes
        GL vertex program:        yes
        GL fragment program:      yes
        GL vertex buffer object:  yes
        GL framebuffer object:    yes
        GL version is 1.4+:       yes

        Unity 3D supported:       no

    So I was wondering, what is the "Not blacklisted" test? Is this the Nouveau blacklisting? The nVIDIA driver did that automatically. Does this need to be removed? Any help would be appreciated. I just want to run Compiz effects. Thanks.
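
    To double-check which kernel driver actually bound to the card (the legacy 173.xx blob vs. nouveau), the following one-liner is handy; a sketch, and the exact output fields vary with the lspci version:

        lspci -nnk | grep -iA3 vga

    The "Kernel driver in use:" line in the output settles whether the blacklisting of nouveau took effect.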

    Read the article

  • VPN within a VM to allow for internet access on the host

    - by David Durrant
    I have a network connection (created under Network and Sharing) that I use to connect to a customer's site. But when I use it to connect to the site, I lose all access to the public internet and can only access customer-specific items. I want to get around this by creating a VM and then using the VM to connect to the network location and interact with the customer's domain, while leaving my host machine open to the internet. I'm not extremely familiar with networking, but I have a few basic skills. Please let me know if this is possible and what the correct procedure is. I already have a VM created with VirtualBox, and both the host and guest are running Windows 7 x64. I have already created duplicate VPNs, but can only connect successfully on the host machine.
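
    For the VM route, the guest usually needs its own presence on the LAN so the tunnel it builds doesn't depend on the host's VPN routing; a minimal sketch with VirtualBox's command-line tool (run while the VM is powered off; the VM name and host adapter name are placeholders):

        VBoxManage modifyvm "CustomerVM" --nic1 bridged --bridgeadapter1 "Local Area Connection"

    With the NIC bridged, the guest gets its own IP on the LAN and can dial the customer VPN independently, while the host's default route to the internet stays untouched.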

    Read the article

  • IBM laptop remove access-connections

    - by Kevin
    I installed IBM Access Connections on my IBM ThinkPad T61 laptop to manage my wireless connections, but it simply will not connect to my network. I want to uninstall it and try the Intel software, which is capable of performing the same task. However, when I go into "Control Panel" - "Add/Remove Programs" and try to "Remove" this package, it simply opens a screen and closes it immediately. This happens too fast for me to see whether it's an error. Has anyone encountered this issue? The event logs do not show any error.
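
    If the bundled uninstaller keeps exiting silently, one fallback (a sketch, assuming the package was installed via Windows Installer; the GUID below is a placeholder to be replaced with the one wmic reports) is to drive the uninstall with msiexec and a verbose log:

        wmic product where "Name like '%Access Connections%'" get Name, IdentifyingNumber
        msiexec /x {PRODUCT-GUID-HERE} /l*v %TEMP%\accessconn-uninstall.log

    The log in %TEMP% should then show why the removal dies, even when the GUI window closes too fast to read.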

    Read the article

  • Screen flicker -> Severe System Slowdown?

    - by Adam Robinson
    I'm using a Dell D830 laptop, and over the last few weeks it has developed a very irritating screen flicker problem that slows the system down almost to the point of unusability. At seemingly random times (no commonality in how long the system has been running, what I was doing, or what applications were open), my screen (I use two external LCDs with the laptop closed in a dock) flickers for a moment, and then the system becomes incredibly slow. The screen redraws painfully slowly, almost like what you might expect to see with generic graphics drivers installed, and the entire system is maddeningly unresponsive. The only thing that seems to correct the issue is a restart. I've checked the event logs and nothing out of the ordinary is there, and definitely nothing common to all of the events. I'm running XP Pro SP2. Any ideas?

    Read the article

  • Screen flickers after resuming from hibernate with Intel GMA 3600 (Acer D270 with Intel N2600)

    - by Cameron
    Fresh system install, with the correct display driver from Intel (8.14.8.1065). Clicking "Update driver" merely results in a message saying the driver is already up to date. After resuming from hibernate, the entire screen flickers. This is especially noticeable while using IE9 (which has hardware acceleration enabled by default) on Google Maps, in particular while typing an address into the map search field, where the flickering is much worse. Note that sleep works fine; only hibernate causes this issue. Restarting "fixes" the problem temporarily, until the next hibernate/resume cycle. Aero is enabled. This is on Windows 7 (Pro) 32-bit, on an Acer Aspire One D270-1998, with the Intel N2600 (which has the Intel GMA 3600 built in).

    Read the article

  • Solution for storing SATA drives outside of case

    - by Jeffrey Kevin Pry
    I have a system with 8 SATA disks in a software RAID 5 array using mdadm. My issue is that I want to move the drives out of the computer case in order to cool them more efficiently. I have looked all over the web and only seem to find enclosures that hide the drives' connectors behind an eSATA port or some internal RAID controller. Basically, what I want is an enclosure (or equivalent) that I can run independent SATA cables to, and that I can either power from the main machine or that has its own power supply. I have the SATA ports available on the motherboard and don't want to limit I/O by using one port with a multiplier or the like. One final caveat: I am a college student on a budget and don't have a fortune to spend on such an enclosure. Thanks in advance for your help and advice.

    Read the article

  • Multiple "My Documents" folders?

    - by Flasimbufasa
    I am having an issue where I have two folders called "My Documents". I recently edited my registry to make the "Documents" link in the Windows 7 start menu a FOLDER link and not a LIBRARY link. Here is the registry key information for the Documents key:

        Key Name:        HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Explorer\FolderDescriptions\{7b0db17d-9cd2-4a93-9733-46cc89022e7c}
        Class Name:      <NO CLASS>
        Last Write Time: 3/2/2011 - 2:33 AM

        Value 0  Name: Attributes         Type: REG_DWORD      Data: 0x1
        Value 1  Name: Category           Type: REG_DWORD      Data: 0x4
        Value 2  Name: Icon               Type: REG_EXPAND_SZ  Data: %SystemRoot%\system32\imageres.dll,-1002
        Value 3  Name: LocalizedName      Type: REG_EXPAND_SZ  Data: @%SystemRoot%\system32\shell32.dll,-34575
        Value 4  Name: Name               Type: REG_SZ         Data: Documents
        Value 5  Name: PublishExpandPath  Type: REG_DWORD      Data: 0x1
        Value 6  Name: PrecCreate         Type: REG_DWORD      Data: 0x1
        Value 7  Name: RelativePath       Type: REG_SZ         Data: Documents
        Value 8  Name: Roamable           Type: REG_DWORD      Data: 0x1

    Also, navigating through Computer to "C:\Users\Flasimbufasa\" only shows one folder called "Documents". However, whenever I navigate to the user profile from "Desktop\Flasimbufasa", I get two Documents folders. Any help?

    Read the article

  • Computer won't start unless power is removed for ~5 minutes

    - by Paul Tarjan
    I have a fairly standard two-year-old desktop computer (quad-core Intel, single hard drive, decent video card, 300 W power supply) which recently started acting up. I'm not sure what the cause is, so hopefully you can help. Sometimes (once a week or so) I press the power button and nothing happens. No blinking, no sounds, no nothing. If I remove the power cord (or flip the switch on the power supply) I hear a capacitor discharge. If I leave it in the "no power at all" state for about 5 minutes, then I can put the plug back in and the computer works perfectly. What is the issue? What do you think I have to replace?

    Read the article

  • Allow from referer for HTTP-basic protected SSL apache site

    - by user64204
    I have an apache site protected by HTTP basic authentication. The authentication is working fine. Now I would like to bypass authentication for users coming from a particular website, by relying on the HTTP Referer header. Here is the configuration:

        SetEnvIf Referer "^http://.*.example\.org" coming_from_example_org

        <Directory /var/www/>
            Options Indexes FollowSymLinks MultiViews
            AllowOverride None

            Deny from all
            Allow from env=coming_from_example_org

            AuthName "login required"
            AuthUserFile /opt/http_basic_usernames_and_passwords
            AuthType Basic
            Require valid-user

            Satisfy Any
        </Directory>

    This is working fine for HTTP, but failing for HTTPS. My understanding is that in order to inspect the HTTP headers, the SSL handshake must be completed, but apache wants to inspect the <Directory> directives before doing the SSL handshake, even if I place them at the bottom of the configuration file.

    Q: How could I work around this issue?

    PS: I'm not obsessed with the HTTP Referer header; I could use other options that would allow users from a known website to bypass authentication.
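
    To confirm what the server actually sees in each case, the request can be replayed with an explicit Referer over both schemes; a quick sketch (hostnames are placeholders):

        # plain HTTP - expected to bypass authentication
        curl -I -e "http://www.example.org/" http://mysite.example.com/
        # HTTPS - currently returns 401
        curl -kI -e "http://www.example.org/" https://mysite.example.com/

    If the HTTPS response is a 401 even with the header present, the next thing to check is whether the HTTPS virtual host really includes the same <Directory> and SetEnvIf configuration as the HTTP one.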

    Read the article

  • Disk quota problem in Windows Server SBS 2003

    - by deddebme
    I have got a new job, and the existing SBS 2003 domain setup is insecure (i.e. everyone is a domain admin, etc.). There are lots of problems due to the inexperienced "network admin", and I am trying to fix them one by one. One issue I find quite weird is that the "Quota" tab exists for the C: (NTFS) drive but not for the D: (NTFS) drive. I played around with gpedit to enable disk quotas (the setting was "not configured" before), but I still can't see that tab. Have you seen this problem before? How did you solve it?
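
    Independently of the missing tab, NTFS quota state can be queried and enabled from the command line with fsutil; a sketch (run as an administrator on the server):

        fsutil quota query D:
        fsutil quota track D:

    If fsutil refuses to report quota information for D: at all, that suggests the volume itself, rather than the Explorer UI, is the problem, for instance if it is not actually a local basic NTFS volume.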

    Read the article

  • How do I stop CRM from asking for admin credentials when printing report

    - by Ac0ua
    Problem: when printing a report in CRM, I am asked by UAC for admin credentials. There is a long URL:

        https://(crm_hostname)/reserved.ReportViewerWebcontrol.axd?ReportSession=(ID)ControlID=(ID)Culture=1333&UICulture=1033&ReportStack=1&OpType=PrintCab

    Info: there are three users that have the issue laid out here. They are not admins of any kind. It looks like it is asking for permission to allow SQL Server Reporting Services 2008 to run. When I put my credentials in, it just brings up the Print dialog box (this part is fine; I just want it to stop asking for credentials). I know this might sound silly, but I downloaded and installed "SQL Server Reporting Services 2008" hoping my credentials would grant permission right from the beginning. I was told that giving the users local admin is not an option. Note: I did post this on community (dot) dynamics (dot) com, but they are not a very active website. Thanks for any help, even if it just points me in the right direction!

    Read the article

  • Limit disk I/O one program creates?

    - by Posipiet
    Hardware: one virtualization server. Dual Nehalem, 24 GB RAM, 2 TB mirrored HD. Software: Debian, KVM, and virt-manager on the server, with several virtual machines that also run Linux. The 2 TB disk is one big LVM volume group; each VM gets a logical volume and makes its own partitions in it.

    Problem: one of the programs that runs in one of the VMs creates a huge disk load. This was never an issue before, because the program never ran on such powerful hardware. Now the CPUs are fast, and lots of I/O is the result. We can't do much about that at the moment, because the tool is a black box. On the other hand, the speedy computation is welcome. The program creates about 5 GB of temp files which get overwritten during the next iteration.

    Question: how can we limit the disk I/O of the process?
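
    Two knobs worth trying, sketched below under stated assumptions (the PID, cgroup name, and device numbers are placeholders; ionice's idle class only takes effect with the CFQ I/O scheduler):

        # demote the offending process to the idle I/O class
        sudo ionice -c 3 -p 12345

        # or cap the VM's read throughput to 10 MB/s via the blkio cgroup (cgroup v1),
        # assuming a cgroup named "kvmguest" already holds the VM's PID
        # and 8:0 is the major:minor of the backing block device
        echo "8:0 10485760" | sudo tee /sys/fs/cgroup/blkio/kvmguest/blkio.throttle.read_bps_device

    ionice is the quicker experiment; the cgroup route is more predictable because the throttle applies at the device level regardless of the scheduler.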

    Read the article

  • solve a partition misalignment?

    - by learner
    I have a new Dell XPS laptop which had Windows 7 installed on it, along with a default extra partition for "Dell Utility". I installed Ubuntu on an extended partition alongside Windows and specified the logical partitions myself (for /, /home and swap). Now when I open Disk Utility, it shows a "Partition misaligned by 512 bytes" error for the Dell Utility partition and "Partition misaligned by 1024 bytes" for the entire extended partition where Ubuntu is installed. Deleting the extended partition and re-installing Ubuntu may solve the misalignment of the extended partition. But what about the Dell Utility partition? Even if I re-installed Windows 7, the Dell Utility partition wouldn't be part of the re-install, so that may not solve it either. How do I fix this?

    Note: the extended partition I made contains an NTFS logical partition for holding data accessible by both OSes (basically a personal data partition).

    EDIT: I deleted all my Ubuntu partitions and re-installed Ubuntu like before, this time creating the partitions with GParted via the live CD. Now the only remaining problem is the misalignment of the Dell Utility partition; the other misalignment got fixed. How do I get rid of that issue?
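
    For what it's worth, alignment can also be checked partition by partition with parted from a live session; a sketch, assuming the disk is /dev/sda:

        sudo parted /dev/sda print                    # list partitions and their numbers
        sudo parted /dev/sda align-check optimal 1    # check partition 1; repeat per partition

    Fixing the Dell Utility partition itself would mean moving or recreating it at an aligned offset, which tools like GParted can do, with the usual "back up first" caveat.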

    Read the article

  • Issues with DHCP over multiple subnets

    - by Dan Monego
    I have a Cisco router configured to handle multiple subnets (10.1.10.n, 10.2.10.n, etc.), and an Ubuntu system serving DHCP to the computers behind the router. After a restart of the DHCP server, the systems on the 10.1 subnet are fine, but neither the server nor the computers configured on the other subnets can see the router at 10.2.10.1 (or 10.3.10.1, or 10.4.10.1). The router can see itself at 10.2.10.1, however. The change that caused this was restarting the server, so I'm working on the assumption that the configuration error is on that end. Is that the likely issue, or is there a different problem that would prevent the machines on the 10.2.10 subnet from resolving DHCP?
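
    For a single dhcpd serving several router-attached subnets, two pieces normally have to line up; the sketch below uses placeholder addresses. Each range needs a subnet declaration on the server, and each router interface needs a helper address so DHCP broadcasts are relayed across subnets:

        # /etc/dhcp/dhcpd.conf (ISC dhcpd): one declaration per subnet
        subnet 10.2.10.0 netmask 255.255.255.0 {
          range 10.2.10.100 10.2.10.200;
          option routers 10.2.10.1;
        }

        ! Cisco side: relay DHCP broadcasts from this subnet to the server
        interface Vlan2
          ip address 10.2.10.1 255.255.255.0
          ip helper-address 10.1.10.5    ! placeholder: the Ubuntu DHCP server

    If the helper addresses are already in place, the next suspect is whether the restarted dhcpd still has declarations for, and binds to, those extra subnets.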

    Read the article

  • Replacement drive cage for PowerEdge R710

    - by bumble_bee_tuna
    Hi, I'm performance-tuning a DB server. It's a Dell R710, and there is a very significant I/O bottleneck. Unfortunately the server was purchased in the 6 x 3.5-inch SATA configuration, which doesn't give me the leeway I need to address the issue. Before going to DAS, does anyone know if it is possible to purchase a replacement front drive enclosure? I know the server can be configured with something like 12 or 16 2.5-inch drives, and it appears to be modular. I tried contacting Dell, but the offshore parts department reps are not very bright lol. Thanks.

    Read the article

  • How can I make the results of a formula into values that can be filtered or used with VLOOKUP in Excel

    - by Burt
    I am using various formulas to move and split data from various sources. The problem is that when my final results post to the destination I want, I still need to either run advanced filters or a VLOOKUP on the results. I can't do this because, as an example, if cell A1 shows a value of A127, the actual cell content is:

        =RIGHT(A2,FIND(" ",A2&" ")-2)

    Everything I read says to copy and paste special values, but this doesn't work for me: the idea is to have the formulas/macros do everything and eliminate cutting and pasting. In the case above, I have a formula that pulls that info from a spreadsheet that is saved every week. Once it is pulled, part of it is cut out into another column. I then need to run a VLOOKUP on those results against data already contained on another tab.

    Read the article

  • How do I find out which process is eating up my bandwidth?

    - by Bruce Connor
    I think I'm being the victim of a bug here. Sometimes while I'm working (I still don't know why), my network traffic goes up to 200 KB/s and stays that way, even though I'm not doing anything internet-related. This sometimes happens to me with CPU usage too. When it does, I just run a top command to find out which process is responsible and then kill it. The problem is: I have no way of knowing which process is responsible for my high network usage. Both the resource monitor and the top command only tell me my total network usage; neither of them gives me process-specific network info. Is there another command I can use to find out which process is getting out of hand? I've already tried killing all the obvious ones (firefox, update-manager, pidgin, etc.) with no luck. So far, restarting the machine is the only way I've found of getting rid of the issue.

    EDIT: (just to be clear) I've found questions here about monitoring total bandwidth usage, but, as I mentioned, that's not what I need.

    UPDATE: The command iftop gives results that disagree entirely with the information reported by System Monitor. While the latter claims there's high network traffic, the former claims there's barely 1 KB/s. Thanks.
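
    Per-process (rather than per-host or per-connection) accounting needs a tool that maps sockets back to PIDs; two hedged options, assuming packages can be installed (nethogs is in the Ubuntu repositories):

        # live table of bandwidth per process
        sudo apt-get install nethogs
        sudo nethogs eth0

        # or a one-shot mapping of active connections to owning processes
        sudo netstat -tupn

    nethogs in particular shows exactly the column top is missing: KB/s sent and received, per process.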

    Read the article
