Search Results

Search found 39047 results on 1562 pages for 'process control'.

Page 957/1562 | < Previous Page | 953 954 955 956 957 958 959 960 961 962 963 964  | Next Page >

  • Unable to use Maya animation with scripts when imported to Unity

    - by keshk
    I am testing how Maya animation imports into Unity. I set up a simple cylinder with two bones and an IK handle, and made a simple animation where the cylinder bends and returns to a straight position over 24 frames. I then selected everything and baked it all: the bones, the IK handle (and the animation, by selecting everything in the graph editor), and even the cylinder. I saved the scene, selected all, and exported as FBX with animation and bake checked. In Unity the import works: I can see the animation in the preview, and when I load the model into the scene and press Play (after assigning the controller) the animation plays too. But when I try to script it and control the animation, nothing happens. As a test I put the following in the Update method:

        if (animation.isPlaying) Debug.Log("Animation Works"); else Debug.Log("Animation not working");

    Neither message is logged, so the bool doesn't seem to return either true or false. My animation is called "bend", so just to try it I also called the following, and nothing happens:

        animation.Play("bend");

    Based on my steps, am I missing something? Do I need to add the controller, or is that an unnecessary step? Did I go wrong on the Maya side or the Unity side? Thanks for the help.
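
    For reference, a minimal sketch of the scripted check (Unity C#; the class name is mine, not from the post). One assumption worth testing: if the model was imported with an Animator Controller (Mecanim) rather than as a Legacy rig, GetComponent<Animation>() returns null and the legacy animation.Play("bend") call does nothing, so the clip has to be driven through the Animator instead:

        using UnityEngine;

        // Attach to the imported model. All names here are illustrative.
        public class BendTest : MonoBehaviour
        {
            void Update()
            {
                Animation legacy = GetComponent<Animation>();
                if (legacy != null)
                {
                    if (!legacy.isPlaying)
                        legacy.Play("bend");        // clip name must match exactly, and the clip must be marked Legacy
                    Debug.Log("Legacy clip playing: " + legacy.isPlaying);
                }
                else
                {
                    Animator animator = GetComponent<Animator>();
                    if (animator != null)
                        animator.Play("bend");      // here "bend" is a state name in the Animator Controller
                    Debug.Log("No Animation component found; the model is using an Animator (Mecanim)");
                }
            }
        }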

    Read the article

  • Should I Use PHP as FastCGI?

    - by Synetech inc.
    Hi, I am running an Apache web server on my Windows machine. It is not really a public server (most of its small amount of traffic comes from the machine itself, and most of the public traffic comes from crawlers). Basically, it is mostly just used as a test-bed/development system. I have read that running PHP as FastCGI is better (i.e. faster and more stable) than running it as an Apache module. However, I really don't like the idea of multiple PHP.exe processes (I don't like that Apache has two processes, and I'm not even too thrilled with Chromium's multi-process model). So I'm wondering whether it would be worthwhile to switch PHP to FastCGI for this scenario. If it is, how would I configure it? Pretty much all of the information I have seen is either for non-Windows or for IIS; as I said, I'm running Windows + Apache. Thanks a lot.

    Read the article

  • 2011 Tech Goal Review

    - by kerry
    A year ago I wrote a post listing my professional goals for 2011. I thought I would review them and see how I did.
    - Release an Android app to the marketplace – Didn't do it. In fact, I haven't really touched Android much since I wrote that. I still have some ideas but am not sure if I will get around to it.
    - Contribute free software to the community – I did do this. I have been collaborating with others via GitHub more lately.
    - Regularly attend user group meetings outside of Java – Did not do this. Family life being what it is, this is not much of a priority right now.
    - Obtain the Oracle Certified Web Developer certification – Did not do this. It is not much of a priority to me any more.
    - Learn Scala – I am about 50/50 on this one. I read a few Scala books but did not write an actual application.
    - Write an app using JSF – Did not do this. Still interested.
    - Present at a user group meeting – I did a Maven presentation at the Java user group.
    - Use git more, and more effectively – Definitely did this. I am using it on a daily basis now.
    Overall, I got about halfway on my goals. That's not too bad, since I also did a few things that weren't on my list:
    - Learned to develop applications using GWT and deploy them to Google App Engine
    - Converted one of my sites from PHP to Ruby / Sinatra (learning to use it in the process)
    - Studied up on the HTML5 features and did a lot of JavaScript development

    Read the article

  • How do I enable a disabled Event Notification?

    - by Derick Mayberry
    I have a scenario where I am using external activation to process documents sent in from the entire navy fleet. Normally I have no problems, but a few days ago an administrator changed passwords, my queue processing failed, and I rolled back the transaction with this C# code:

        catch (Exception)
        {
            TransporterService.WriteEventToWindowsLog(AppName, "Rolling Back Transaction:", ERROR);
            broker.Tran.Rollback();
            break;
        }

    After that, my target queue continued to fill up but nothing reached the external activation queue. Does the event notification get disabled once a transaction is rolled back? Should I have called broker.EndDialog when catching my exception? Also, if my event notification is disabled (if that is actually what's happening), how do I re-engage it? Do I have to drop it and recreate it? Thanks in advance for any help; I love Service Broker and it's working wonderfully apart from this problem, which I hope to fix soon.
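
    For what it's worth, a minimal diagnostic sketch (plain ADO.NET; the connection string, queue and notification names below are hypothetical, not from the question). One common cause of this symptom is Service Broker's poison-message handling: five consecutive rollbacks of a RECEIVE disable the queue, and activation stops until the queue is switched back on. A dropped event notification, on the other hand, has to be re-created with CREATE EVENT NOTIFICATION. The sketch checks both:

        using System;
        using System.Data.SqlClient;

        class BrokerQueueCheck
        {
            static void Main()
            {
                // Hypothetical connection string and object names.
                const string connStr = "Server=.;Database=MyBrokerDb;Integrated Security=true";
                using (var conn = new SqlConnection(connStr))
                {
                    conn.Open();

                    // Is the queue still able to RECEIVE? Poison-message handling turns this off.
                    var status = new SqlCommand(
                        "SELECT is_receive_enabled FROM sys.service_queues WHERE name = 'ExternalActivationQueue'", conn);
                    object result = status.ExecuteScalar();
                    bool receiveEnabled = result != null && (bool)result;

                    if (!receiveEnabled)
                    {
                        // Re-enable a queue that was disabled after repeated rollbacks.
                        new SqlCommand("ALTER QUEUE ExternalActivationQueue WITH STATUS = ON", conn).ExecuteNonQuery();
                    }

                    // Does the event notification still exist? If not, it must be re-created in T-SQL.
                    var notif = new SqlCommand(
                        "SELECT COUNT(*) FROM sys.event_notifications WHERE name = 'QueueActivationNotification'", conn);
                    if ((int)notif.ExecuteScalar() == 0)
                        Console.WriteLine("Event notification is gone; re-create it with CREATE EVENT NOTIFICATION.");
                }
            }
        }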

    Read the article

  • Location Services are always disabled in Mac OS X Lion

    - by rplusg
    A simple location services program was working fine on my machine and suddenly stopped working. Exploring the problem further, I realized that some process has disabled location services in System Preferences » Security & Privacy » Privacy. I checked Enable Location Services, but it got disabled again automatically. After some research I found that it's not just my program; built-in system functions are also failing because of this problem, for example System Preferences » Date & Time » Time Zone fails to get the current location. Every time I check Enable Location Services, I see the following error in the console logs:

        16/10/12 11:23:15.636 AM [0x0-0x42042].com.apple.systempreferences: ERROR,Time,372059595.636,Function,"CLInternalSetLocationServicesEnabled",CLInternalSetLocationServicesEnabled failed
        16/10/12 11:23:15.638 AM [0x0-0x42042].com.apple.systempreferences: STACK,Time,372059595.636,1 CoreLocation 0x00007fff8f9957be CLInternalSetLocationServicesEnabled + 110

    Notes:
    - WiFi is on
    - I didn't install the iOS Simulator
    - I use Xcode Version 4.5 (4G182)
    - I use Boot Camp and made my MacBook Pro dual boot (Mac OS X Lion and Windows 7)
    - I do only Mac development, not iOS

    Read the article

  • Split DC role from existing Exchange 2007 server

    - by Graeme Donaldson
    We currently have a single Exchange 2007 Server on Windows Server 2008. It's also a DC and I'd like to split the DC role to a different box. Is this doable without migrating the mailboxes off to a temporary box, re-installing and migrating back? I.e. can I just demote the server without breaking Exchange completely? I know this was quite painful with Server 2003/Exchange 2003, so I'm trying to get an idea of how much different the process is for Server 2008/Exchange 2007.

    Read the article

  • In search of database delivery practitioners and enthusiasts

    - by Claire Brooking
    We know from speaking with many of you at tradeshows and user groups that database delivery is not a factory production line. During planning, evaluation, quality control, and disaster mitigation, the people having their say at each step means that successful database deployment is a carefully managed course of action. With so many factors involved at every stage, we would love to find a way for our software to help out, by simplifying processes, speeding them up or joining together the people and the steps that make it all happen. We’re hoping our new research group for database delivery (SQL Server and Oracle) will help us understand the views and experiences of those of you out there in the trenches managing database changes. As part of our new group, we’ll be running a variety of research sessions, including surveys and phone interviews, over coming months. If you have opinions to share on Continuous Integration or Continuous Delivery for databases, we’d love to hear from you. Your feedback really will count as the product teams at Red Gate build plans. For some of our more in-depth sessions, we’ll also be offering participants an Amazon voucher as a thank-you for your time. If you’re not yet practising automated database deployment processes, but are contemplating or planning it, please do consider joining our research group too. If you’d like to sign up to the group and find out more, please fill in a quick form online, and we’ll be in touch to let you know about new research opportunities you might be interested in. We look forward to hearing your stories!

    Read the article

  • links for 2011-01-03

    - by Bob Rhubart
    - Using Solaris zfs + iscsi targets with Oracle VM (Wim Coekaerts Blog): "I was playing with my Oracle VM setup and needed some shared storage that was block based. I did not have a storage array available but I did have a Solaris box, that I use for Oracle VDI, available." - Wim Coekaerts (tags: oracle otn solaris oraclevm virtualization)
    - DanT's GridBlog: Oracle Grid Engine: Changes for a Bright Future at Oracle: "Today, we are entering a new chapter in Oracle Grid Engine's life. Oracle has been working with key members of the open source community to pass on the torch for maintaining the open source code base to the Open Grid Scheduler project hosted on SourceForge." - Dan Templeton (tags: oracle gridengine)
    - Oracle Fusion Middleware Security: How do I secure my services? "I've been up early for a couple of days talking to a customer about how they should secure their services," says Chris Johnson. "I'm going to tell you what I told them." (tags: oracle fusionmiddleware security)
    - OldSpice your Innovation - Dangers of Status Quo E2.0 | Enterprise 2.0 Blogs: "If organizations only leverage E2.0 technologies in a 'me too' fashion, they are essentially using a bucket to bail water from a leaking ship." - John Brunswick (tags: oracle enteprise2.0)
    - The Aquarium: GlassFish in 2011 - What to expect. A look into the GlassFish crystal ball... (tags: oracle glassfish)
    - Andrejus Baranovskis's Blog: Fusion Middleware 11g Security - Retrieve Security Groups from ADF 11g. Oracle ACE Director Andrejus Baranovskis shows you what to do when you need to access security information directly from an ADF 11g application. (tags: oracle otn fusionmiddleware security adf)
    - @eelzinga: Book review: Oracle SOA Suite 11g R1 Developer's Guide. "What I really liked in this book...was the compare/description of the Oracle Service Bus. The authors did a great job on describing functionality of components existing in the SOA Suite and how to model them in your own process." - Oracle ACE Eric ElZinga (tags: oracle oracleace soa bookreview soasuite)

    Read the article

  • Cannot log in to the Dashboard / Unable to find the server at mykeystoneurl

    - by neo0
    I installed the Dashboard following this guide: http://wiki.openstack.org/OpenStackDashboard Everything went fine, but when I run the server I cannot log in with the username and password from the DATABASES config in local_settings.py. Here's my config:

        DATABASES = {
            'default': {
                'ENGINE': 'django.db.backends.mysql',
                'NAME': 'dashboarddb',
                'USER': 'nova',
                'PASSWORD': 'nova',
                'HOST': 'localhost',
                'default-character-set': 'utf8'
            },
        }

    When I run the Dashboard server and enter the username and password, the browser returns this error:

        Unable to find the server at mykeystoneurl (HTTP 400)

    And on the command line:

        DEBUG:openstack_dashboard.settings:Running in debug mode without debug_toolbar.
        DEBUG:openstack_dashboard.settings:Running in debug mode without debug_toolbar.
        Validating models...
        0 errors found
        Django version 1.3.1, using settings 'openstack_dashboard.settings'
        Development server is running at http://0.0.0.0:8888/
        Quit the server with CONTROL-C.
        Request returned failure status.
        Traceback (most recent call last):
          File "/home/us/horizon/.venv/src/python-keystoneclient/keystoneclient/client.py", line 121, in request
            body = json.loads(body)
          File "/usr/lib/python2.7/json/__init__.py", line 326, in loads
            return _default_decoder.decode(s)
          File "/usr/lib/python2.7/json/decoder.py", line 366, in decode
            obj, end = self.raw_decode(s, idx=_w(s, 0).end())
          File "/usr/lib/python2.7/json/decoder.py", line 384, in raw_decode
            raise ValueError("No JSON object could be decoded")
        ValueError: No JSON object could be decoded
        [06/Mar/2012 15:20:03] "POST /auth/login/ HTTP/1.1" 200 3735

    I also tried logging in as "admin" with the password "password" or "secrete", but it didn't work. What's wrong? Thank you!

    Read the article

  • Are there any concrete examples of where a parallelizing compiler would provide a value-adding benefit?

    - by jamie
    Paul Graham argues that:

        "It would be great if a startup could give us something of the old Moore's Law back, by writing software that could make a large number of CPUs look to the developer like one very fast CPU. ... The most ambitious is to try to do it automatically: to write a compiler that will parallelize our code for us. There's a name for this compiler, the sufficiently smart compiler, and it is a byword for impossibility."

    But is it really impossible? Can someone provide a concrete example where a parallelizing compiler would solve a pain point? Web apps don't appear to be a problem: just run a bunch of Node processes. Real-time raytracing isn't a problem: the programmers are writing multi-threaded, SIMD assembly language quite happily (indeed, some might complain if we make it easier!). The holy grail is to be able to accelerate any program, be it MySQL, GarageBand, or Quicken. I'm looking for a middle ground: is there a real-world problem that you have experienced where a "smart-enough" compiler would have provided a real benefit, i.e. one that someone would pay for? A good answer is one where a process runs the computer at 100% CPU on a single core for a painful period of time. That time might be 10 seconds if the task is meant to be quick, 500 ms if the task is meant to be interactive, or 10 hours. Please describe such a problem. Really, that's all I'm looking for: candidate areas for further investigation. (Hence, raytracing is off the list because all the low-hanging fruit have been feasted upon.) I am not interested in why it cannot be done; there are a million people willing to point to the sound reasons why it cannot be done, and such answers are not useful.
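
    As a point of reference for what "solving the pain point" would look like, here is a minimal sketch (C#, with a made-up CPU-bound kernel) of the transformation such a compiler would have to discover and prove safe on its own; today a human does it by hand with Parallel.For:

        using System;
        using System.Threading.Tasks;

        class ParallelSketch
        {
            // Stand-in for real, CPU-bound per-item work (purely hypothetical).
            static double Work(int i)
            {
                double acc = 0;
                for (int k = 1; k < 50000; k++)
                    acc += Math.Sin(i * 0.001 + k) / k;
                return acc;
            }

            static void Main()
            {
                var results = new double[100000];

                // Sequential version: pegs one core at 100% for a painful period of time.
                // for (int i = 0; i < results.Length; i++) results[i] = Work(i);

                // Hand-parallelized version: safe only because the iterations are independent,
                // and proving that independence is exactly what a compiler would have to automate.
                Parallel.For(0, results.Length, i => results[i] = Work(i));

                Console.WriteLine("Computed " + results.Length + " results");
            }
        }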

    Read the article

  • What are the benefits of running an app server in user space, like Unicorn, as opposed to running it as root?

    - by dan
    I've been using Phusion Passenger + Rails/Sinatra for a lot of projects. Passenger runs under the main Nginx or Apache process. But I'm interested in Unicorn, partly because it runs in user space. You just set up Nginx to proxy_pass requests to a unix socket that is connected to Unicorn processes that you fire up under a normal user account. Is there anything to be said as far as advantages and disadvantages of these two alternative approaches to running a web app? I mean in terms of ease of administration, stability, simplicity, etc.

    Read the article

  • Google Analytics - drop in traffic

    - by user1001421
    Bit of a general question here. We are in the process of converting a number of our clients from older web sites to new ones. The problem, and sorry for being so general here, is that we are seeing a sharp decline in traffic as reported by Google Analytics. It's not a gradual decline; it seems to hit almost as soon as the new site goes live. I've just got a few questions to see if there is something we are doing wrong:
    a) We are using the same analytics accounts going from the old site to the new one. Is this a bad idea?
    b) The actual analytics code is integrated into the pages using a server-side include. Is this a bad idea?
    c) We structure our new sites differently from the old ones. The old sites would pretty much have all the web pages in the root directory, and hyperlinks would point to the page files, e.g. <a href="somepage.aspx">Link</a>. Our new sites have a directory structure that pretty much reflects the navigation structure, and hyperlinks point to the page's directory instead of the actual page, e.g. <a href="/new-items/shoes/">New shoes</a>. Is this a bad idea?
    I'm really searching for a needle in a haystack here. I would appreciate any help or advice as to why we are getting such a sharp and sudden drop in traffic. Again, sorry this is such a general question. Thanks in advance.

    Read the article

  • Inspiring the method of teaching. Example: C++ :)

    - by Ashwin
    A year ago I graduated with a degree in Computer Science and Engineering. With C++ as my first choice of programming language, I have been learning it in many ways. At first - five years back - I had many conceptions, most of which were quite abstract to me. It started when I knew almost everything about structs in C and nothing about classes in C++. I had a great time experimenting with them all and learning a lot, but I had a hard time evaluating procedural programming versus object-oriented programming, and deciding when to choose one or the other took a great deal of patience. I knew I could not underestimate either programming style. Procedural programming is often a better choice than simple sequential, unstructured programming: we divide one problem into several ordered steps, written as functions, and then call those functions one by one to get the result. When solving problems with object-oriented principles, we divide one problem into several classes and design the interactions between them. Evaluating these two approaches as a beginner required a lot of inspiration and thought: being taught to think step by step, understanding the related concepts deeply, and having enough interest to contrast solving the same problem in both POP and OOP. If you were ever a mentor: what ideas or methods would you use to inspire students to learn a programming language (or computer science in general)?
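
    To make the contrast concrete, here is a minimal, hypothetical illustration of the two decompositions described above (written in C# to match the other sketches on this page; the same split carries over directly to C and C++). The procedural version keeps plain data and free-standing functions, while the object-oriented version groups data and behavior behind an interface:

        using System;

        // Procedural decomposition: plain data plus free-standing functions.
        static class ProceduralStyle
        {
            public struct CircleData { public double Radius; }

            public static double Area(CircleData c)
            {
                return Math.PI * c.Radius * c.Radius;
            }
        }

        // Object-oriented decomposition: each class owns its data and behavior,
        // and callers work against an interface rather than concrete types.
        interface IShape
        {
            double Area();
        }

        class Circle : IShape
        {
            private readonly double _radius;
            public Circle(double radius) { _radius = radius; }
            public double Area() { return Math.PI * _radius * _radius; }
        }

        class Demo
        {
            static void Main()
            {
                var data = new ProceduralStyle.CircleData { Radius = 2.0 };
                Console.WriteLine(ProceduralStyle.Area(data));   // procedural call

                IShape shape = new Circle(2.0);
                Console.WriteLine(shape.Area());                 // polymorphic call
            }
        }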

    Read the article

  • Windows 8 Automatic Logon Tick Box Missing

    - by Luke Kenny
    Recently (in the past few days), perhaps following the latest Windows Update, it appears the tick box to allow automatic logon in "control userpasswords2" or "netplwiz" has disappeared. I have two machines running Windows 8 and the option is no longer available on either. Both machines use a Microsoft account, rather than a local account, for the primary user's logon. The only other recent change I can think of, and I am confident it was made well before this issue arose, was enabling HomeGroup. How can I re-enable automatic logon for the affected user?

    Read the article

  • Need an updated glibc package (GLIBCXX 3.4.15 or later) for RHEL 6

    - by Tejas
    I want to upgrade my currently running applications to their latest versions, but due to a package issue I am unable to install them. The common error I get is:

        /usr/lib64/libstdc++.so.6: version 'GLIBCXX_3.4.15' not found

    When I tried to update the glibc package I got the following output:

        [root@agastya ~]# yum install glibc
        Loaded plugins: refresh-packagekit, rhnplugin
        epel/metalink                    | 3.8 kB     00:00
        epel                             | 4.3 kB     00:00
        epel/primary_db                  | 5.0 MB     01:33
        epel-testing/metalink            | 3.8 kB     00:00
        epel-testing                     | 4.3 kB     00:00
        epel-testing/primary_db          | 295 kB     00:03
        rhel-x86_64-server-6             | 1.8 kB     00:00
        rhel-x86_64-server-6/primary     |  11 MB     02:02
        rhel-x86_64-server-6                          8816/8816
        Setting up Install Process
        Package glibc-2.12-1.80.el6_3.6.x86_64 already installed and latest version
        Nothing to do
        [root@agastya ~]#

    Do I need to add more repositories? If so, how?

    Read the article

  • Configuring Samba to allow use of a CUPS printer

    - by Skizz
    Having trouble with Samba printing. I have a CUPS printer installed on an Ubuntu 11.04 server and that works great. When I try to configure Samba to allow an XP machine to use the printer, it fails when printing. I can install the printer drivers for XP from the server and the printer appears in the XP printer control panel. When I try to print a test page from the XP machine I get this error in the system event log:

        Jun 27 20:33:29 FatController smbd[3571]: [2012/06/27 20:33:29, 0] rpc_server/srv_netlog_nt.c:603(_netr_ServerAuthenticate3)
        Jun 27 20:33:29 FatController smbd[3571]: _netr_ServerAuthenticate3: netlogon_creds_server_check failed. Rejecting auth request from client JAMES machine account JAMES$

    Here's my smb.conf file:

        [global]
        server string = %h (Server)
        workgroup = SODOR
        encrypt passwords = true
        security = user
        os level = 255
        preferred master = yes
        domain master = yes
        local master = yes
        logon path = \\%L\profile\%U
        logon drive = S:
        logon home = \\%L\home\%U
        domain logons = yes
        map to guest = Never
        guest ok = no
        dns proxy = no
        time server = yes
        logon script = logon.bat
        load printers = yes
        printing = cups
        printcap name = cups
        nt acl support = no
        interfaces = eth1 lo
        bind interfaces only = yes
        smb ports = 445

        [netlogon]
        comment = Net Log On
        path = /home/samba/netlogon
        guest ok = no
        read only = yes
        browseable = no

        [profile]
        comment = User Profiles
        path = /home/samba/profiles
        read only = no
        create mask = 0600
        directory mask = 0700
        browseable = no
        store dos attributes = yes

        [printers]
        comment = All Printers
        path = /var/spool/samba
        browseable = yes
        guest ok = no
        printable = yes

        [print$]
        comment = Printer Drivers
        path = /var/lib/samba/printers
        browseable = yes
        guest ok = no
        read only = yes
        write list = root, skizz

    Anyone know what the problem is and how to fix it? In addition to the above, I also get this error:

        Jun 27 21:56:35 FatController smbd[3571]: [2012/06/27 21:56:35, 0] printing/print_cups.c:1027(cups_job_submit)
        Jun 27 21:56:35 FatController smbd[3571]: Unable to print file to `Edward' - client-error-not-authorized

    which I think is more relevant.

    Read the article

  • IBM laptop: cannot remove Access Connections

    - by Kevin
    I installed IBM Access Connections on my IBM T61 laptop to manage my wireless connections, but it simply will not connect to my network. I want to uninstall it and try the Intel software, which is capable of performing the same task. However, when I go into "Control Panel" - "Add/Remove Programs" and try to "Remove" the package, it simply opens a screen and closes it immediately. This happens too fast for me to see whether it's an error. Has anyone encountered this issue? The event logs do not show any error.

    Read the article

  • Should I start MCPD training now or wait for new exams?

    - by lunchmeat317
    I apologize if this question has been asked before, or if this is the wrong place to put it. I'm beginning my study track for the MCPD certification in Web Development. However, Microsoft plans to retire this certification on July 31, 2013, along with two of the tests required to receive it. On Microsoft's site I can't find a newer certification path to take. I imagine that Microsoft will release new certification paths and new tests for their new software, but I don't know when that will happen, and I don't really know anything about Microsoft's process, as this is the first Microsoft certification I'll be studying for. The bottom line is this: I don't want to lose six months waiting for a new, non-expiring test to appear, but I also don't want to rush into a certification that will be invalid in six months (or have to reset my progress because of new study material). To those with experience in situations like this: what is the best course to take, and how can I make the most of the time I have now rather than waiting for new testing material? Is there any way to find material for the new tests that Microsoft will be rolling out? Thank you for your patience. If this is the wrong place to put this question, I would like to request that it be moved to the correct StackExchange site instead of being closed. Thanks for your help!

    Read the article

  • Should I cache the data or hit the database?

    - by JD01
    I have not worked with any caching mechanisms and was wondering what my options are in the .NET world for the following scenario. We basically have a REST service where the user passes the ID of a category (think folder); that category may have lots of sub-categories, and each sub-category could hold thousands of media containers (think file reference objects) which contain information about a file that may be on a NAS or SAN server (the files are videos in this case). The relationships between these categories are stored in a database, together with some permission rules and metadata about the sub-categories. So from a UI perspective we have a lazy-loaded tree control which is driven by the user clicking on each sub-folder (think of Windows Explorer). Once they get to the URL of a video file, they can then watch the video. The number of users could grow into the thousands, and the sub-categories and videos could be in the tens of thousands as the system grows. The question is: should we carry on the way it currently works, where each request hits the database, or should we think about caching the data? We are using IIS 6/7 and ASP.NET.
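
    A minimal cache-aside sketch of the alternative to hitting the database on every request (C#, assuming .NET 4+ and System.Runtime.Caching; CategoryRepository, MediaContainer and LoadFromDatabase are hypothetical names, not from the question). The category tree is read far more often than it changes, which is exactly the case where a short-lived in-memory cache pays off:

        using System;
        using System.Collections.Generic;
        using System.Runtime.Caching;

        class MediaContainer { public string FileUrl { get; set; } }

        class CategoryRepository
        {
            private static readonly MemoryCache Cache = MemoryCache.Default;

            public IList<MediaContainer> GetContainers(int categoryId)
            {
                string key = "category:" + categoryId;

                // Cache hit: no database round trip.
                var cached = Cache.Get(key) as IList<MediaContainer>;
                if (cached != null)
                    return cached;

                // Cache miss: load once, keep for a few minutes to bound staleness.
                IList<MediaContainer> containers = LoadFromDatabase(categoryId);
                Cache.Set(key, containers,
                    new CacheItemPolicy { AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(5) });
                return containers;
            }

            private IList<MediaContainer> LoadFromDatabase(int categoryId)
            {
                // Placeholder for the real database query.
                return new List<MediaContainer>();
            }
        }

    If the app later runs on more than one web server, a shared cache (memcached, AppFabric, or similar) would play the same role across machines; for a single IIS box, in-process caching like this is usually enough.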

    Read the article

  • LCD monitor reports incorrect maximum resolution

    - by SLaks
    I have four 20" Planar 2010M LCD monitors with a maximum resolution of 1600 x 1200, connected to two nVidia video cards (8600 GT and 7600 GS). I'm running Windows Server 2003 x86. Recently, two of the monitors have started mis-reporting their maximum resolution as 1280 x 1024. When this first happened, I used nVidia's Custom Resolutions feature to force the monitors back to 1600 x 1200. Yesterday, however, I upgraded nVidia's video card driver, and ever since, I cannot get the DVI one back to 1600 x 1200. When I add the custom resolution in nVidia's control panel, if I set either the width or the height to even a single pixel more than 1280 x 1024, nothing changes when I click Test (the monitor doesn't even flash black, although after 15 seconds it flashes black and doesn't change). Does anyone know what the problem is? Is there anything I can do about it?

    Read the article

  • How to Transfer All Your Information to a New PS3

    - by Justin Garrison
    The PlayStation 3 now costs half the price, has double the storage, and uses half the power. If you need another reason to upgrade, Sony also makes it easy to transfer all of your information to a new console. Transferring all of your games, data, and settings is easier than ever, and all you need is an ethernet cable. Read on as we walk you through the whole process of setting up your new PS3 and wiping all your information off the old one.

    Read the article

  • How to build the mainline kernel source package?

    - by Maxime R.
    The Ubuntu kernel PPA only provides linux-headers*.deb and linux-image*.deb packages. How can I build the corresponding linux-source*.deb package?
    Context: I'm currently running Ubuntu 11.10 with the mainline kernel (3.2 rc6 at the moment) to get better support for my Sandy Bridge IGP (Dell E6420 laptop with an Intel i5-2520M CPU). On top of that, I'd like to install this touchpad driver, ALPS touchpads being badly supported (see the bug report in the previous link), while waiting for upstream support in kernel version 3.3. The problem is, DKMS keeps complaining about not finding the full kernel source:

        Module build for the currently running kernel was skipped since the kernel source for this kernel does not seem to be installed.

    It appears I may not need the full source, but I'd still like to try having it installed to see if it solves my problem. What I tried:
    - Uncompressing the kernel.org source archive in /usr/src/. DKMS still complained.
    - Manually updating the kernel source package with uupdate and the mainline source package, as explained here. Did not succeed.
    - Manually building the linux-source package following @roadmr's and @elmicha's instructions. I eventually succeeded in building it, but DKMS still complained about the missing source. Then I noticed an error I had not caught the first time while reinstalling the kernel headers: it appears the .deb I had downloaded may have been corrupted, and downloading it again did the trick :) Alas, while DKMS then agreed to compile the module, I ran into the following error, which appears to have already been reported. That issue isn't solved yet, but I won't pursue it, because in the end I decided to test the exact kernel version 3.2-rc6 through the xorg-edgers PPA, which appears to be correctly patched: it works.
    Nevertheless, it might still be of some interest to know how to build the mainline linux-source package, as the Ubuntu Kernel Team doesn't provide it. Not to mention that I learned a lot in the process ^^

    Read the article

  • Why CFOs Should Care About Big Data

    - by jmorourke
    The topic of “big data” clearly has reached a tipping point in 2012.  With plenty of coverage over the past few years in the IT press, we are now starting to see the topic of “big data” covered in mainstream business press, including a cover story in the October 2012 issue of the Harvard Business Review.  To help customers understand the challenges of managing “big data” as well as the opportunities that can be created by leveraging “big data”, Oracle has recently run and published the results of a customer survey, as well as white papers and articles on this topic.  Most recently, we commissioned a white paper titled “Mastering Big Data: CFO Strategies to Transform Insight into Opportunity”.
    The premise here is that “big data” is not just a topic that CIOs should pay attention to, but one that CFOs should understand and take advantage of as well.  Clearly, whoever masters the art and science of big data will be positioned for competitive advantage in their industries or markets.  That’s why smart CFOs are taking control of big data and business analytics projects, not just to uncover new ways to drive growth in a slowing global economy, but also to be a catalyst for change in the enterprise.  With an increasing number of CFOs now responsible for overseeing IT investments and providing strategic insight to the board, CFOs will be increasingly called upon to take a leadership role in assessing the value of “big data” initiatives, building on their traditional skills in reporting and helping managers analyze data to support decision making.
    Here’s a link to the white paper referenced above, which is posted on the Oracle C-Central/CFO web site, as well as some other resources that can help CFOs master the topic of “big data”:
    - White Paper: “Mastering Big Data: CFO Strategies to Transform Insight into Opportunity”
    - CFO Market Watch article: “Does Big Data Affect the CFO?”
    - Oracle Survey Report: “From Overload to Impact – An Industry Scorecard on Big Data Industry Challenges”
    - Upcoming Big Data Webcast with Andrew McAfee
    Here’s a general link to Oracle C-Central/CFO in case you want to start there: www.oracle.com/c-central/cfo
    Feel free to contact me if you have any questions or need additional information: [email protected]

    Read the article

  • Need to run a .sh as root on boot or at login

    - by Graymayre
    I'm still new to Linux and am running Ubuntu 12.10. I have a wireless stick (AE2500) which has known issues that have been partially solved using ndiswrapper. However, to use it I must run the same scripts every time I reboot, effectively uninstalling and reinstalling the driver. I made a .sh file so it's easy to run each time, but I must enter my sudo password every time. There are three solutions I am looking for; although not all of them are necessary to solve this particular problem, I would still like to know them all for learning purposes:
    1. Run scripts or file.sh on boot (as well as other programs).
    2. Run scripts or file.sh automatically with root privileges.
    3. Make the install permanent, so as not to have to go through the process every time.
    Any additional information that can help me with this that I did not think to ask about (including streamlining my commands), or general knowledge, would be greatly appreciated. Following are the contents of the file; I pretty much just wrote it as I would have entered the commands:

        cd ~/ndiswrapper-1.58rc1
        sudo modprobe -rf ndiswrapper
        sudo rm /etc/modprobe.d/ndiswrapper.conf
        sudo rm -r /etc/ndiswrapper/*
        sudo depmod -a
        sudo make uninstall
        sudo make
        sudo make install
        sudo ndiswrapper -i bcmwlhigh5.inf
        ndiswrapper -l
        sudo modprobe ndiswrapper

    Read the article

  • How to achieve reliable Gigabit Ethernet Link with my Acer Aspire Revo R3610?

    - by The Operator
    I want to stream HD movies over my wired Gigabit LAN from my PC to my Acer Aspire Revo R3610. It's connected with a 3ft Cat5e patch cable to my Netgear GS605v2 Switch. The PC acting as File Server is connected at 1Gbps to the Switch. Network driver options are set to defaults, including automatic speed/duplex negotiation on both machines. The Revo will not connect to my Network Switch at 1Gbps - the OS reports that it reverts to 100Mbps either shortly after connection or immediately upon connection. Through a process of elimination (trying different drivers, patch cables, ports on the switch, and other 1Gbps-capable devices connected to the Network switch which successfully achieve 1Gbps links and performance) I have drawn the conclusion there is either a Hardware or Software (Driver) issue with the Revo itself. I have performed tests using Windows 7 and Ubuntu 9.10. Can anyone offer insight on Gigabit Ethernet with the Revo?

    Read the article

< Previous Page | 953 954 955 956 957 958 959 960 961 962 963 964  | Next Page >