Search Results

Search found 709 results on 29 pages for 'aaron stewart'.


  • Profit's COLLABORATE 10 Session Selections

    - by Aaron Lazenby
    COLLABORATE 2010 is a mere 11 days away (thanks for the reminder @ocp_advisor). Every year I publish a list of the sessions I think reflect some of the more interesting people/trends in enterprise IT. I should be at all of these sessions, so drop by for a chat--I'll be the guy tapping out emails on my iPad...

    Monday, April 19

    9:15 a.m. - Keynote: Transforming Customer Value, Delivering Highest Customer Service | Location: Keynote Hall
    I never miss Charles Phillips when he speaks--it's one of the best opportunities to get an update on Oracle product developments and strategy. And there's certainly occasion for an update: this will be Phillips' first big presentation since the Oracle + Sun Strategy Update in late January. Phillips is appearing with Oracle Executive Vice President of Development Thomas Kurian, which means there should be some excellent information about how customers are using Oracle's complete software and hardware stack to address enterprise IT challenges. The session should provide some excellent context for the rest of the week's sessions...don't miss it.

    10:45 a.m. - Oracle Fusion Applications: Functional Overview | Location: South Seas F
    I met Basheer Khan at COLLABORATE 08 in Denver and have followed his work ever since. He's a former member of the OAUG Board of Directors, an Oracle ACE, and a charismatic enterprise IT expert. Having worked with the Oracle Usability Advisory Board, Basheer should have some fascinating insights to share about the features and interface of Oracle's Fusion Applications. This session, along with Nadia Bendjedou's "10 Things You Can Do Today to Prepare for the Next Generation Applications" (on Tuesday, April 20 at 8:00 a.m. in room 3662), should give attendees the update they need about Oracle's next-generation applications.

    1:15 p.m. - E-Business Suite in the Amazon Cloud | Location: South Seas H
    I did my first full-fledged cloud computing coverage at last year's COLLABORATE show (check out my interview with Oracle's Bill Hodak), where I first learned about Amazon's EC2 offering. I've since talked with several people who have provisioned server space on Amazon's cloud with great results. So I'm looking forward to watching the audience configure an instance of the Oracle E-Business Suite release 12 on the cloud while Chuck Edwards from Blue Gecko drives. This session should take some of the mist and vapor out of the cloud conversation.

    2:30 p.m. - "Zero Sign-on" to EBS - Enabling 96000 Users to Login to EBS Without User Maintenance | Location: South Seas H
    I'll be sitting tight in South Seas H for the next session on Monday, where Doug Pepka, a ten-year veteran of communications giant Comcast, will be walking attendees through a massive single sign-on (SSO) project across the enterprise. I'm working on a story about SSO for the August issue of Profit, so this session has real practical value to me. Plus the proliferation of user account logins--both personal and professional--makes this a critical usability/change management issue for IT leaders planning for successful long-term IT implementations.

    Tuesday

    8:00 a.m. - Information Architecture for Men in Kilts | Location: SURF A
    Getting to an 8:00 a.m. presentation is a tall order in Las Vegas, but presenter Billy Cripe will make it worth your effort. Not only is the title of this session great, but the content should appeal to any IT strategist looking to push the limits of Web 2.0 technologies in the enterprise. Cripe is a product management director of Enterprise 2.0 and Enterprise Content Management at Oracle, author of Reshaping Your Business with Web 2.0, and a prolific blogger--he knows how information architecture is critical to an Enterprise 2.0 implementation.

    10:30 a.m. - Oracle Virtualization: From Desktop to Data Center | Location: REEF F
    Data center virtualization is still one of the best ways to reduce the cost of running enterprise IT. With the addition of Sun products, Oracle has the industry's most comprehensive virtualization portfolio. I must admit, I'm no expert in this subject. So I'm looking forward to Monica Kumar's presentation so I can get up to speed.

    Wednesday

    8:00 a.m. - The Art of the Steal | Location: Mandalay Bay Ballroom J
    Many will know Frank Abagnale from Steven Spielberg's 2002 film "Catch Me If You Can." The one-time con man and international fugitive who swindled $2.5 million in forged checks went on to help U.S. federal officials investigate fraud cases. Now the CEO of Abagnale and Associates, he has become an invaluable source to the business world on the subject of fraud and fraud protection. With identity theft and digital fraud still on the rise, this session should be an entertaining, and sobering, education on the threats facing businesses and customers around the world. A great way to start Wednesday.

    1:00 p.m. - Google Wave: Will it replace e-mail as we know it today? | Location: SURF E
    By many assessments (my own included), Google Wave is a bit of an open collaboration failure. It may seem like an odd reason for me to be excited about this session, but I'm looking forward to the chance to revisit the technology. Also, this is a great case study in connecting free, available Internet tools to existing enterprise computing environments--an issue that IT strategists must contend with as workers spread out and choose their own productivity tools.

    Read the article

  • USB install from second internal hard drive in a MacBook Pro

    - by aaron.anderson
    I am trying to install Ubuntu (among others) on a second internal hard drive in my MacBook Pro. I have an 80 GB internal SSD with OS X on it, along with a 750 GB internal HD with a few partitions, one of which is for Ubuntu. I currently have rEFInd installed for switching between the OSes. I was wondering how one would go about installing Ubuntu from the USB install stick. I have followed the instructions on creating a bootable USB. Once this is bootable, could I just hold the Option key on startup, and it should appear in the menu? Or am I missing something?

    Read the article

  • Apache VERY high page load time

    - by Aaron Waller
    My Drupal 6 site has been running smoothly for years, but recently it has experienced intermittent periods of extreme slowness (10-60 second page loads): several hours of slowness followed by hours of normal (4-6 second) page loads. The page always loads with no error; it just sometimes takes forever.

    My setup: Windows Server 2003, Apache/2.2.15 (Win32), JRun 4.0, PHP 5, MySQL 5.1, Drupal 6, ColdFusion 9, VMware virtual environment, DMZ behind a corporate firewall. Traffic: 1-3 hits/sec average.

    Troubleshooting so far: there are no applicable errors in the Apache error log and no errors in the Drupal event log. The Drupal devel module shows 242 queries in 366.23 milliseconds and a page execution time of 2069.62 ms (so it looks like queries and PHP scripts are not the problem). There is no unusually high CPU, memory, or disk IO. ColdFusion apps and other static pages outside of Drupal also load slowly, and a webpagetest.org test shows very high time-to-first-byte.

    The problem seems to be with Apache responding to requests, but previously I've only seen this behavior under 100% CPU load, and judging solely by resource monitoring, very little is going on. Here is the kicker: roughly half of the site's access comes from our LAN, and if I disable the firewall rule and block access from outside our network, internal (LAN) access (1000+ devices) is speedy. But as soon as outside access is restored, the site is crippled. Apache config? Crawlers/bots? Attackers? I'm at the end of my rope--where should I be looking to determine where the problem lies?
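
    One thing still on my list to rule out is per-request reverse DNS: lookups resolve instantly for LAN clients but can stall for outside addresses, which would match the LAN-fast/external-slow pattern exactly. A minimal httpd.conf sketch of the directives I plan to check (assuming a fairly stock Apache 2.2 on Windows--these are starting points, not a known fix):

        # Don't do a reverse DNS lookup on every request; log raw IPs
        # and resolve them offline (e.g. with logresolve) if needed.
        HostnameLookups Off

        # Keep persistent connections short so slow external clients
        # don't pin down worker threads between requests.
        KeepAlive On
        KeepAliveTimeout 5

        # On Windows, Apache 2.2 uses mpm_winnt: one process with a fixed
        # pool of worker threads. If they're all tied up, new requests
        # queue and time-to-first-byte spikes with no visible CPU load.
        <IfModule mpm_winnt_module>
            ThreadsPerChild 250
        </IfModule>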

    Read the article

  • Recommendations on eReader for technical reference material

    - by Aaron Kowall
    I've been thinking that an eBook reader would be handy since I travel a lot. I'm not really all that worried about taking novels and pleasure reading as much as taking along work-related books and reference material. I haven't really done a lot of research into the various options (Sony, Kindle, Nook, iPad, etc.) but am aware that not all content can be read on all readers, even if it is in ePub format, due to DRM. Anybody got a recommendation on which device/store combination offers the best selection of technical reference material for a .NET developer with a particular interest in software process engineering? HELP!!

    Technorati Tags: eBook,eReader,iPad,Kindle,Nook,Sony,ePub,PDF

    Read the article

  • Make sure you take advantage of Visual Studio 2010 license offers before April 12th

    - by Aaron Kowall
    For Visual Studio 2010 Professional
    Microsoft Visual Studio 2010 Professional will launch on April 12, but you can beat the rush and secure your copy today by pre-ordering at the affordable estimated retail price of $549, a saving of $250. If you use a previous version of Visual Studio or any other development tool, then you are eligible for this upgrade. Along with all the great new features in Visual Studio 2010 (see www.microsoft.com/visualstudio), Visual Studio 2010 Professional includes a 12-month MSDN Essentials subscription which gives you access to core Microsoft platforms: Windows 7 Ultimate, Windows Server 2008 R2 Enterprise, and Microsoft SQL Server 2008 R2 Datacenter. So visit http://www.microsoft.com/visualstudio/en-us/pre-order-visual-studio-2010 to check out all the new features and sign up for this great offer.

    Ultimate Offer for MSDN Subscribers
    Also, don't forget about the Ultimate Offer, which basically gives you the opportunity to use a higher-end Visual Studio SKU for the duration of your MSDN agreement. Be sure to review your MSDN license BEFORE APRIL 12th so you are in the right spot to maximize this benefit.

    Technorati Tags: VS 2010

    Read the article

  • I’m IN!

    - by Aaron Kowall
    Got an email this morning (yes, I checked, and it wasn't an April Fools' joke) congratulating me on becoming a Microsoft MVP. I've been working with and among MVPs for quite a while and quite frankly felt left out. Well, I'm finally part of the crowd. I received an MVP award for Visual Studio ALM. This makes me VERY proud, as I know the high caliber of the existing Visual Studio ALM MVPs and am honored to now count myself among them. Now my challenge is to make sure I continue to do those things that got me nominated, so that I can retain this honor.

    Technorati Tags: MVP,VS2010,ALM

    Read the article

  • Preview - Profit, May 2010

    - by Aaron Lazenby
    Whew! Last Friday, we put the finishing touches on the May 2010 edition of Profit, Oracle's quarterly business and technology journal. The issue will be back from the printer and live on the website in mid-April. Here's a preview:

    Turning Crisis into Opportunity
    During the depths of the financial crisis, San Francisco, California-based Wells Fargo & Company launched a bold acquisition of Wachovia Bank--one of the largest financial services mergers in history. Learn how Oracle software helped Wells Fargo CFO Howard Atkins prepare his office for the merger--and assisted with the integration of the companies once the deal was done.

    Building on Success
    Global construction firm Hill International takes project management to new heights with Oracle's Primavera solutions.

    Product Management, In Black and White
    Catch up with Zebra Technologies to see how Oracle's Agile applications connect with an existing Oracle E-Business Suite system.

    A Perfect Match
    Learn how technology makes good medicine in this interview with National Marrow Donor Program CIO Michael Jones.

    The IT Ties That Bind
    How information systems are helping manage knowledge workers in a post-9-to-5 work world.

    I'll post a link to the new edition once it's live. Hope you enjoy!

    Read the article

  • Oracle's CFO Summit: Live Updates Tomorrow

    - by Aaron Lazenby
    Leaving tonight for Oracle's CFO Summit in Atlanta, GA. Will be sending live tweets out over @OracleProfit with updates of the proceedings. Economist Martin Neil Baily will be presenting information about the state of the economy, as will prominent Oracle executives and members of the financial services sector. Should be an informative day--look for updates here and on Twitter. 

    Read the article

  • Imaginet Resources acquires Notion Solutions

    - by Aaron Kowall
    Huge news for my company, and for me especially. http://www.imaginets.com/news--events/imaginet_acquisition_notion.html With the acquisition we become a very significant player in the Microsoft ALM space. This increases our scale significantly, and also our knowledge base: we now have two Regional Directors and a pile of MS MVPs. The timing couldn't be more perfect, since the launch of Visual Studio 2010 and Team Foundation Server 2010 is TODAY!! Oh, and we aren't done with announcements today... More later.

    Technorati Tags: VS 2010,TFS 2010,Notion,Imaginet

    Read the article

  • Architecture advice for converting biz app from old school to new school?

    - by Aaron Anodide
    I've got a WinForms business application that evolved over the past few years. It's forms-over-data with a number of custom UI experiences tailored to the business, so I don't think it's a candidate to port to something like SharePoint or re-write in LightSwitch (at least not without significant investment). When I started it in 2009, I was new to this type of development (coming from more low-level programming, and my RDBMS knowledge was just slightly greater than what I got from school). Thus, when I was confronted with a business model that operates on a strict monthly accounting cycle, I made the unfortunate decision to create a separate database for each accounting period. Also, when I started I knew DataSets, then I learned Linq2Sql, then I learned EntityFramework. The screens are a mix and match of those. Now, after a few years developing this thing by myself, I've finally got a small team. Ultimately, I want a web front end (for remote access to more straight-up screens with grids of data) and a thick client (for the highly customized interfaces). My question is: can you offer me some broad-strokes architecture advice that will help me formulate a battle plan to convert over to a single database and lay the foundations for my future goals at the same time? Here's a screenshot showing how an older screen uses DataSets and a newer screen uses EF (I'm thinking this might make it more real for someone reading the question--I'm willing to add any amount of detail if someone is willing to help).

    Read the article

  • Design Pattern for Complex Data Modeling

    - by Aaron Hayman
    I'm developing a program that has a SQL database as a backing store. As a very broad description, the program itself allows a user to generate records in any number of user-defined tables and make connections between them. As for specs:

    Any record generated must be able to be connected to any other record in any other user table (excluding itself...the record, not the table). These "connections" are directional, and the list of connections a record has is user-ordered. Moreover, a record must "know" of connections made from it to others as well as connections made to it from others. The connections are kind of the point of this program, so there is a strong possibility that the number of connections made is very high, especially if the user is using the software as intended. A record's field can also include aggregate information from its connections (like obtaining the average, sum, etc.) that must be updated on change from another record it's connected to. To conserve memory, only relevant information must be loaded at any one time (I can't load the entire database into memory at load and go from there). I cannot assume the backing store is local. Right now it is, but eventually this program will include syncing to a remote db. Neither the user tables, connections nor records are known at design time, as they are user-generated.

    I've spent a lot of time trying to figure out how to design the backing store and the object model to best fit these specs. In my first design attempt, I had one object managing all a table's records and connections. I attempted this first because it kept the memory footprint smaller (records and connections were simple dicts), but maintaining aggregate and link information between tables became...onerous (i.e., a huge spaghettified mess). Tracing dependencies using this method almost became impossible. Instead, I've settled on a distributed graph model where each record and connection is 'aware' of what's around it by managing its own data and connections to other records. Doing this increases my memory footprint but also lets me create a faulting system so connections/records aren't loaded into memory until they're needed. It's also much easier to code: tracing dependencies, eliminating cyclic recursive updates, etc.

    My biggest problem is storing/loading the connections. I'm not happy with any of my current solutions/ideas, so I wanted to ask and see if anybody else has any ideas of how this should be structured. Connections are fairly simple. They contain: fromRecordID, fromTableID, fromRecordOrder, toRecordID, toTableID, toRecordOrder. Here's what I've come up with so far:

    1. Store all the connections in one big table. If I do this, either I load all connections at once (one big db call) or make a call every time a user table is loaded. The big issue here: the size of the connections table has the potential to be huge, and I'm afraid it would slow things down.

    2. Store in separate tables all the outgoing connections for each user table. This is probably the worst idea I've had. Now my connections are 'spread out' over multiple tables (one for each user table), which means I have to make a separate DB call to each table (or make a huge join) just to find all the incoming connections for a particular user table. I've avoided making "one big ass table", but I'm not sure the cost is worth it.

    3. Store in separate tables all outgoing AND incoming connections for each user table (using a flag to distinguish between incoming and outgoing). This is the idea I'm leaning towards, but it will essentially double the total DB storage for all the connections (as each connection will be stored in two tables). It also means I have to make sure connection information is kept in sync in both places. This is obviously not ideal, but it does mean that when I load a user table, I only need to load one 'connection' table to have all the information I need.

    This also presents a separate problem: that of connection object creation. Since each user table has a list of all connections, there are two opportunities for a connection object to be made. However, connection objects (designed to facilitate communication between records) should only be created once. This means I'll have to devise a common caching/factory object to make sure only one connection object is made per connection (see the sketch below).

    Does anybody have any ideas of a better way to do this? Once I've committed to a particular design pattern I'm pretty much stuck with it, so I want to make sure I've come up with the best one possible.
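
    For that last point, here is a minimal sketch of the caching factory I have in mind--C++ purely for illustration, and the key and field names are my own placeholders, not a settled schema:

        #include <map>
        #include <memory>
        #include <tuple>

        // Uniquely identifies a connection by its two endpoints.
        struct ConnectionKey {
            int fromTableID, fromRecordID;
            int toTableID, toRecordID;
            bool operator<(const ConnectionKey& o) const {
                return std::tie(fromTableID, fromRecordID, toTableID, toRecordID)
                     < std::tie(o.fromTableID, o.fromRecordID, o.toTableID, o.toRecordID);
            }
        };

        struct Connection {
            ConnectionKey key;
            int fromRecordOrder = 0;  // user ordering on the outgoing side
            int toRecordOrder = 0;    // user ordering on the incoming side
        };

        // Hands out at most one live Connection object per key, no matter
        // which of the two user tables asks for it first.
        class ConnectionCache {
        public:
            std::shared_ptr<Connection> get(const ConnectionKey& key) {
                auto it = cache_.find(key);
                if (it != cache_.end())
                    if (auto existing = it->second.lock())
                        return existing;          // already live: share it
                auto fresh = std::make_shared<Connection>();
                fresh->key = key;
                cache_[key] = fresh;              // register (or re-register)
                return fresh;
            }
        private:
            // weak_ptr so the cache doesn't defeat the faulting system by
            // keeping every connection alive forever.
            std::map<ConnectionKey, std::weak_ptr<Connection>> cache_;
        };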

    Read the article

  • Best S.E.O. practice for backlinking etc

    - by Aaron Lee
    I'm currently working on a website that I am really looking to optimise in terms of search engines. I've been submitting between 5-20 directory submissions daily, I've validated and optimised my code, and I've joined a lot of forums etc. to speak of the website in question; however, I don't seem to be making much of an impact in terms of Google. I know that S.E.O. takes a while to start making an impact, and that Google prefers sites that are regularly updated and aged, but are there any more practices that can really help with organic results in search engines? I have looked on Google itself, and a few other search engines, but nobody is willing to talk about extensive S.E.O. practices, as they normally don't want people knowing their formulas for S.E.O. Also, does anyone know of a decent piece of software that really looks into the ins and outs of your page and provides feedback? I usually use http://www.woorank.com, but only using one program doesn't show if it's exactly correct in what it's saying. If anyone could help it would be much appreciated, thank you very much.

    Read the article

  • Links for PrDC10 Session Visual Studio 2010 Testing Tools

    - by Aaron Kowall
    Here are the links I promised to post from my session on Visual Studio 2010 Testing Tools.

    To download and configure the TFS 2010 Virtual Machine, the best instructions are here: http://blogs.msdn.com/b/briankel/archive/2010/03/18/now-available-visual-studio-2010-release-candidate-virtual-machines-with-sample-data-and-hands-on-labs.aspx

    To download and configure the Lab Management Virtual Machine, the best instructions are here: http://blogs.msdn.com/b/lab_management/archive/2010/02/12/one-box-lab-management-walkthrough.aspx

    Thanks to all who attended my presentation! Hope you learned a bit.

    Technorati Tags: PrDC10,TFS 2010,VHD,Lab Management

    Read the article

  • Not Drowning, Being Saved By a Dog

    - by Aaron Lazenby
    Really, there's no dog in this story. Just a week without travel to get some actual work done. I had plans to blog ambitiously from COLLABORATE 10 (Wi-Fi was limited; the iPad is still untested), but it's a much busier week than your agenda suggests. Scheduling sessions is one thing: you can count on those chunks of time being lost to the universe. It's the bumping into people in the hall and dropping in on an impromptu lunch that really knocks things out of whack. Good thing too: I met with some great folks from

    Read the article

  • Can't remove libreoffice3.5-dict-de package

    - by Aaron Digulla
    I just installed the wrong version of LibreOffice (x86 instead of 64-bit). So I tried to remove it, but I get this error:

        > aptitude remove libreoffice3.5-dict-de
        Couldn't find package "libreoffice3.5-dict-de".
        However, the following packages contain "libreoffice3.5-dict-de" in their name:
          libreoffice3.5-dict-de
        Couldn't find package "libreoffice3.5-dict-de".
        However, the following packages contain "libreoffice3.5-dict-de" in their name:
          libreoffice3.5-dict-de
        No packages will be installed, upgraded, or removed.
        0 packages upgraded, 0 newly installed, 0 to remove and 2 not upgraded.
        Need to get 0 B of archives. After unpacking 0 B will be used.

    The same happens when I try to use "Software Management". What's going on?

    aptitude: 0.6.4-1ubuntu2
    Ubuntu 11.10 oneiric

    Read the article

  • Good resources for language design

    - by Aaron Digulla
    There are lots of books about good web design, UI design, etc. With the advent of Xtext, it's very simple to write your own language. What are good books and resources about language design? I'm not looking for a book about compiler building (like the dragon book) but something that answers:

    How do I create a grammar that is forgiving (like allowing optional trailing commas)?
    Which grammar patterns cause problems for users of a language?
    How do I create a compact grammar without introducing ambiguities?
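
    To make the first question concrete, this is the kind of thing I mean--a hypothetical Xtext-style list rule (the rule names are invented for illustration) where a trailing comma is simply tolerated:

        // Accepts "a, b, c" and also "a, b, c," -- the final ','? makes
        // the dangling comma optional instead of a syntax error.
        ElementList:
            elements+=Element (',' elements+=Element)* ','?;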

    Read the article

  • Simulating an object floating on water

    - by Aaron M
    I'm working on a top-down fishing game. I want to implement some physics and collision detection for the boat moving around the lake. I would like to be able to implement thrust from either the main motor or the trolling motor, the effect of wind on the object, and the drag of the water on the object. I've been looking at the Farseer physics engine, but not having any experience using a physics engine, I am not quite sure that Farseer is suitable for this type of thing (most of the demos seem to be the application of gravity to a vertical top-down type model). Would the Farseer engine be suitable, or would a different engine be more appropriate?
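
    Whether it ends up inside Farseer or hand-rolled, the force model I'm after is simple enough to sketch. Something like this (C++ just for illustration; the mass and drag numbers are made-up placeholders, not tuned values):

        #include <cmath>

        struct Vec2 {
            float x, y;
            Vec2(float x_ = 0.0f, float y_ = 0.0f) : x(x_), y(y_) {}
            Vec2 operator+(const Vec2& o) const { return Vec2(x + o.x, y + o.y); }
            Vec2 operator*(float s) const { return Vec2(x * s, y * s); }
            float length() const { return std::sqrt(x * x + y * y); }
        };

        struct Boat {
            Vec2 position, velocity;
            float mass = 400.0f;  // kg -- placeholder
        };

        // One fixed timestep: main/trolling motor thrust plus wind,
        // opposed by water drag that grows with the square of speed.
        void step(Boat& b, const Vec2& thrust, const Vec2& wind, float dt) {
            const float kDrag = 30.0f;  // placeholder drag coefficient
            Vec2 drag = b.velocity * (-kDrag * b.velocity.length());
            Vec2 force = thrust + wind + drag;
            b.velocity = b.velocity + force * (dt / b.mass);
            b.position = b.position + b.velocity * dt;
        }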

    Read the article

  • SDL 1.2 reports wrong screen size

    - by Aaron Digulla
    I have a multi-monitor setup with two displays, both 1920x1200. In games, I can only select resolutions larger than 1920x1200 (like 2560x1200), which makes games unusable. Full screen doesn't work either, because it switches one display to 800x600, which means I can't reach the close button... I have to kill the game, and then I have to restore my desktop because all windows are moved/resized. How can I force SDL to use any resolution that I want?
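
    For reference, here is roughly what I've been trying (a sketch only; as far as I know the SDL_VIDEO_FULLSCREEN_HEAD variable is honored by SDL 1.2's X11 backend, so treat that as an assumption):

        #include <cstdio>
        #include <cstdlib>
        #include <SDL/SDL.h>

        int main() {
            // Ask SDL 1.2's X11 driver to treat one head as "the" screen
            // instead of the combined multi-monitor desktop.
            setenv("SDL_VIDEO_FULLSCREEN_HEAD", "0", 1);

            if (SDL_Init(SDL_INIT_VIDEO) != 0) return 1;

            // Print the modes SDL thinks are available for fullscreen use.
            SDL_Rect** modes = SDL_ListModes(NULL, SDL_FULLSCREEN | SDL_HWSURFACE);
            if (modes && modes != (SDL_Rect**)-1)
                for (int i = 0; modes[i]; ++i)
                    std::printf("%dx%d\n", modes[i]->w, modes[i]->h);

            // Explicitly request a single-monitor mode.
            SDL_SetVideoMode(1920, 1200, 32, SDL_FULLSCREEN | SDL_HWSURFACE);
            SDL_Quit();
            return 0;
        }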

    Read the article

  • glColor3f Setting colour

    - by Aaron
    This draws a white vertical line from 640 to 768 at x=512:

        glDisable(GL_TEXTURE_2D);
        glBegin(GL_LINES);
        glColor3f((double)R/255, (double)G/255, (double)B/255);
        glVertex3f(SX, -SPosY, 0); // origin of the line
        glVertex3f(SX, -EPosY, 0); // ending point of the line
        glEnd();
        glEnable(GL_TEXTURE_2D);

    This works, but after having a problem where it wouldn't draw the line white (or in any colour passed), I discovered that disabling GL_TEXTURE_2D before drawing the line, and then re-enabling it afterwards for other things, fixed it. I want to know: is this a normal step a programmer might take? Or is it highly inefficient? I don't want to be causing any slowdowns due to a mistake =) Thanks

    Read the article

  • How to install drivers for NVIDIA GeForce FX 5200 on Precise

    - by Aaron Simmons
    Ok, I'm a total Linux noob. I was able to install 12.04 without issue, except that it says in the system settings that the graphics card/driver is "unknown". I have an NVIDIA GeForce FX 5200 and have not been able to get the driver installed. I found the driver at NVIDIA but couldn't figure out how to actually install it. I found instructions that used apt-get to automatically find the current driver and install it, and that came close: at least then it showed up in the third-party drivers list. It said it was installed but not being used, and I was unable to find out why that might be, or how to get the system to use it. Two questions: 1. CAN/SHOULD my graphics card work with 12.04? 2. If so, HOW? I'm running a 100% fresh install, so what's the step-by-step from there?

    Read the article

  • Kiosk Mode Coding in Chromium

    - by Aaron
    I don't know how easy this would be, since I don't know anything about it, but I need an Ubuntu setup where the machine boots up, displays the login for a few seconds allowing a chance to log in as an admin, and then proceeds to automatically log in to a user account which directly opens Chromium (any other browser is acceptable) in a kiosk mode where only the web content is visible, all Chromium keyboard shortcuts are disabled, and all but a select few websites are blocked, redirecting back to the home page after an "Unauthorized web page" warning comes up if the URL constraint is violated. Is it possible to code a kiosk setup like this, or am I asking for too much? If I'm simply uninformed, and there is already much documentation on anything like this, please redirect me to an appropriate page. If you can code or set up something like my description, please reply with step-by-step instructions, and instructions on how to modify the elements of the kiosk mode. Thank you in advance for any help. (Note: I'm currently using Ubuntu 10.04, but any distribution would work.)

    Read the article

  • Why am I not getting an sRGB default framebuffer?

    - by Aaron Rotenberg
    I'm trying to make my OpenGL Haskell program gamma-correct by making appropriate use of sRGB framebuffers and textures, but I'm running into issues making the default framebuffer sRGB. Consider the following Haskell program, compiled for 32-bit Windows using GHC and linked against 32-bit freeglut:

        import Foreign.Marshal.Alloc(alloca)
        import Foreign.Ptr(Ptr)
        import Foreign.Storable(Storable, peek)
        import Graphics.Rendering.OpenGL.Raw
        import qualified Graphics.UI.GLUT as GLUT
        import Graphics.UI.GLUT(($=))

        main :: IO ()
        main = do
            (_progName, _args) <- GLUT.getArgsAndInitialize
            GLUT.initialDisplayMode $= [GLUT.SRGBMode]
            _window <- GLUT.createWindow "sRGB Test"
            -- To prove that I actually have freeglut working correctly.
            -- This will fail at runtime under classic GLUT.
            GLUT.closeCallback $= Just (return ())
            glEnable gl_FRAMEBUFFER_SRGB
            colorEncoding <- allocaOut $ glGetFramebufferAttachmentParameteriv
                gl_FRAMEBUFFER gl_FRONT_LEFT gl_FRAMEBUFFER_ATTACHMENT_COLOR_ENCODING
            print colorEncoding

        allocaOut :: Storable a => (Ptr a -> IO b) -> IO a
        allocaOut f = alloca $ \ptr -> do
            f ptr
            peek ptr

    On my desktop (Windows 8 64-bit with a GeForce GTX 760 graphics card) this program outputs 9729, a.k.a. gl_LINEAR, indicating that the default framebuffer is using linear color space, even though I explicitly requested an sRGB window. This is reflected in the rendering results of the actual program I'm trying to write: everything looks washed out because my linear color values aren't being converted to sRGB before being written to the framebuffer. On the other hand, on my laptop (Windows 7 64-bit with an Intel graphics chip), the program prints 0 (huh?) and I get an sRGB default framebuffer by default whether I request one or not! And on both machines, if I manually create a non-default framebuffer bound to an sRGB texture, the program correctly prints 35904, a.k.a. gl_SRGB. Why am I getting different results on different hardware? Am I doing something wrong? How can I get an sRGB framebuffer consistently on all hardware and target OSes?

    Read the article

  • Logic in Entity Components Systems

    - by aaron
    I'm making a game that uses an Entity/Component architecture (basically a port of Artemis's framework to C++). The problem arises when I try to make a PlayerControllerComponent. My original idea was this:

        class PlayerControllerComponent : public Component {
        public:
            virtual void update() = 0;
        };

        class FpsPlayerControllerComponent : public PlayerControllerComponent {
        public:
            void update() {
                // handle input
            }
        };

    and have a system that updates PlayerControllerComponents, but I found out that the Artemis framework does not look at subclasses the way I thought it would. So, all in all, my question here is: should I make the framework aware of subclasses, or should I add a new Component-like object that is used for logic?
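
    To be clear about that second option, this is roughly what I mean by a "Component-like object used for logic"--a sketch only, with invented names, not the actual Artemis API:

        #include <functional>
        #include <utility>

        // Stand-in for the framework's component base class.
        struct Component {
            virtual ~Component() {}
        };

        // One concrete type the framework can map directly; the varying
        // behavior lives in the callable, not in a subclass hierarchy.
        class ScriptComponent : public Component {
        public:
            explicit ScriptComponent(std::function<void()> logic)
                : logic_(std::move(logic)) {}
            void update() { if (logic_) logic_(); }
        private:
            std::function<void()> logic_;
        };

        // Usage: an FPS controller becomes data instead of a new type.
        // ScriptComponent fpsController([] { /* handle input */ });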

    Read the article

  • My collection of favourite TFS utilities

    - by Aaron Kowall
    So, you're in charge of your company or team's Team Foundation Server. Wish it was easier to manage, administer, extend? Well, here are a few utilities that I highly recommend looking at.

    I've recently had need to rebuild my laptop and upgrade my local TFS environment to TFS 2012 Update 1. This gave me cause to enumerate some of the utilities I like to have on hand.

    One of the reasons I love to use TFS on projects is that it's basically a complete ALM toolkit. Everything from task management, version control, build management, test management, metrics and reporting is there 'in the box'. However, no matter how complete a product set is, there are always ways to make it better. Here is a list of utilities and libraries that are pretty generally useful. This is not intended to be an exhaustive list of TFS extensions but rather a set that I recommend you look at; there are many more out there that may be applicable in one scenario or another. This set of tools should work with TFS 2012 or 2010 if you grab the right version. Most of these tools (and more) are available from the Visual Studio Gallery or CodePlex.

    General
    TFS Power Tools - 'The' collection of utilities and extensions delivered by the product group. Highly recommended from here are the Best Practice Analyzer, for ensuring your TFS implementation is healthy, and the Team Foundation Server Backups, to ensure your TFS databases are backed up correctly.
    TFS Administrators Toolkit - Helps make updates to work item types and reports across many team projects. Also provides visibility of disk usage by finding large files in version control or test attachments, to assist in managing storage utilization.

    Version Control
    Git-TF - A set of cross-platform, command-line tools that facilitate sharing of changes between TFS and Git. These tools allow a developer to use a local Git repository and configure it to share changes with a TFS server. Great for all Git lovers who must integrate with a TFS repository.

    Testing
    TFS 2012 Tester Power Tool - A utility for bulk copying test cases, which assists in an approach for managing test cases across multiple releases. A little plug: this utility is written and maintained by Anna Russo of Imaginet, where I also work.
    Test Scribe - A documentation power tool designed to construct documents directly from TFS for test plan and test run artifacts, for the purpose of discussion, reporting, etc.

    Reporting
    Community TFS Report Extensions - A single repository of SQL Server Reporting Services reports for Team Foundation Server 2010 (and above). Check out the Test Plan Status report by Imaginet's Steve St. Jean. Very valuable for your test managers.

    Builds
    TFS Build Manager - A great utility if you are the build manager for a complex build environment with many TFS build definitions.
    Community TFS Build Extensions - Contains many custom build activities. Current release binaries are for TFS 2010, but many of the activities can be recompiled for use with TFS 2012.

    While compiling this list, I was surprised by the number of TFS utilities and extensions I no longer use or need in TFS 2012, thanks to the great work by the TFS team addressing many gaps since the 2010 release.

    Are there any utilities you depend on that I've missed? I'd love to hear about them in the comments!

    Read the article

  • Can I use Ubuntu Server to replace our Windows environment?

    - by Aaron English
    I have recently been put in charge of a network overhaul for our company. I have done plenty with Ubuntu in school, but it has been a few years. I would like to replace our current servers with Ubuntu, although I am unsure whether it will work. Our current environment runs a domain, Exchange, and a VPN. I know there are solutions capable of all this. I guess my main worry is: will Windows 7 and Windows XP be able to use Ubuntu as a Domain Controller? If anyone has had success with this I would love some input. I have a meeting in a couple of months where I am supposed to explain our plan. Thank you.

    Read the article
