Search Results

Search found 10644 results on 426 pages for 'flash integration'.


  • Java Magazine: Java at Sea!

    - by Tori Wieldt
    The September/October issue of Java Magazine is now out, with several great Java stories, including:
    - Java at Sea: Liquid Robotics charts a new course with expert help from Java pioneer James Gosling.
    - Duke's Choice Awards: Meet this year's winners! (The awards will be presented at the JavaOne Sunday night reception at the Taylor Street Cafe.)
    - Looking Ahead to Project Lambda: Java Language Architect Brian Goetz on the importance of lambda expressions.
    - JCP Q&A: Ben Evans: The London JUG representative talks about the JCP and the Java community.
    - Java EE Connector Architecture 1.6: Adam Bien on deep integration with connector services in a lean way.
    - DataFX: Populate JavaFX Controls with Real-World Data: Tools to retrieve, parse, and render data in a variety of JavaFX controls.
    - Fix This: Stephen Chin challenges your JavaFX skills.
    Java Magazine is a bi-monthly online publication. It includes technical articles on the Java language and platform; Java innovations and innovators; JUG and JCP news; Java events; links to online Java communities; and videos and multimedia demos. Subscriptions are free.

    Read the article

  • Multiplayer online game engine/pipeline

    - by Slav
    I am implementing an online multiplayer game where the client must be written in AS3 (Flash), to embed the game into the browser, and the server in C++ (an abstract part of which is already written and used with other games). Networking models may differ, but currently I'm leaning toward running the game's logic on both the client and the server, even though they're written in different languages; that is not the main problem. My previous game (a pretty big one, implemented by ~5 programmers over 1.5 years) was mainly "written" within electronic tables (spreadsheets) as structured objects with inheritance: we wrote a standalone tool which generated AS3 and C++ (the languages of the platforms the game was published to) from a specified spreadsheet file (.xls or .ods). That file contained ~50 tables with ~50 rows and ~50 columns each and was mainly written by game designers who do not know any programming language. But that game was single-player. With that in mind, for the MMO I'm now implementing I'm looking for some broader pipeline that resolves problems like:
    - game object descriptions (which starships exist within the game, how much HP they have, how fast they move, what damage they deal...)
    - action descriptions (what players or NPCs can do: attack each other, collect resources, build structures, move, teleport, cast spells) - actions are transmitted through the server between clients
    - influences (what happens when a specified action is applied to a specified object, i.e. "Ship A attacked Ship B: field "HP" of Ship B is reduced by the amount of field "damage" of Ship A"). Influences can be much more complicated, e.g. "damage is twice its size when the Ship has >=5 allies around it in a 200-unit range during night", and so on.
    If such logic can be written within some "design document", it becomes easy to:
    - let designers do their job without programmer intervention or any bug-prone programming
    - validate the described logic
    - transfer (transform, convert) it to any programming language where it will be executed
    Has somebody worked on something like that? Are there tools/engines/pipelines concerned with this? What is the best way to handle all of these problems simultaneously, or am I framing my tasks and problems properly to begin with?
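    A minimal sketch of the "logic as data" idea described above, written in Java purely for illustration (the real pipeline would generate AS3 and C++ from the designers' tables): an object type and an influence rule are plain data, as they would be in a spreadsheet row, and a tiny interpreter applies them. All names here (ObjectType, Influence, the "attack" rule) are hypothetical.

    ```java
    import java.util.HashMap;
    import java.util.Map;

    public class InfluenceSketch {

        // One row of the "game objects" table: a starship type with designer-edited fields.
        static class ObjectType {
            final String name;
            final Map<String, Integer> fields = new HashMap<>();
            ObjectType(String name, int hp, int damage) {
                this.name = name;
                fields.put("HP", hp);
                fields.put("damage", damage);
            }
        }

        // One row of the "influences" table: "field <target> of the defender is reduced
        // by the amount of field <source> of the attacker".
        static class Influence {
            final String targetField;
            final String sourceField;
            Influence(String targetField, String sourceField) {
                this.targetField = targetField;
                this.sourceField = sourceField;
            }
            void apply(ObjectType attacker, ObjectType defender) {
                int newValue = defender.fields.get(targetField) - attacker.fields.get(sourceField);
                defender.fields.put(targetField, newValue);
            }
        }

        public static void main(String[] args) {
            ObjectType shipA = new ObjectType("Ship A", 100, 25);
            ObjectType shipB = new ObjectType("Ship B", 80, 10);
            Influence attack = new Influence("HP", "damage"); // the "attack" row from the table
            attack.apply(shipA, shipB);
            System.out.println(shipB.name + " HP is now " + shipB.fields.get("HP")); // prints 55
        }
    }
    ```

    Because the rules are plain data, the same tables can be validated by a standalone tool and then emitted as AS3 for the client and C++ for the server, which is essentially the pipeline the question is after.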

    Read the article

  • Video on Hyperion Tax Provision

    - by Lia Nowodworska - Oracle
    (in via Jan) EPM Information Development has asked us to remind you about the new video available for Hyperion Tax Provision. You can view it on the OracleEPMWebcasts YouTube channel here: http://bit.ly/1jxLlCy - an information-rich 4:40 minutes of your time, so please take a look. The video gives a brief overview of the main features of Hyperion Tax Provision. You will learn:
    - that the Tax Provision and Reporting System builds on the Hyperion Financial Close Reporting Platform
    - that much of the Tax Provision flow process is similar to the Financial Close process, and that the modules have been aligned to work together very closely
    - that HTP enables you to integrate book and tax reporting on a common platform
    - that it uses the technology of HFM, ties in with SmartView, and can be used with Hyperion Financial Reporting
    - that the native integration between the financial system and the tax system creates transparency for tax departments and removes bottlenecks in the tax process
    More technical information can be found here: Oracle Hyperion Tax Provision Data Sheet, Oracle Hyperion Tax Provision White Paper, Oracle Hyperion Tax Provision Documentation. If you have another 45 minutes to spare and want to get into greater detail, you can check out the recording of an Advisor Webcast that we did earlier last year. You can find it via KM Doc "Oracle Business Analytics Advisor Webcast Schedule and Archive Recordings" (Doc ID 1456233.1) -> select the tab "Archived 2013"; it is the third from the top: "Oracle Hyperion Tax Provision - Features and Overview with Demo". If you have questions about that Advisor Webcast, you may participate in the Community Discussion about it. (layout and post: Torben, authorized: Lia)

    Read the article

  • Internet Explorer 9 is coming Monday to a web near you

    - by brian_ritchie
    Internet Explorer 9 is finally here... well, almost. Microsoft is releasing their new browser on March 14, 2011. IE9 has a number of improvements, including:
    - Faster, Faster, Faster. Did I mention it is faster? With the new browsers coming out from Mozilla, Google, and Microsoft, there has been a flood of speed-test coverage. Chrome has long held the JavaScript speed crown. But according to Steven J. Vaughan-Nichols over at ZDNet, "for the moment at least IE9 is actually the fastest browser I’ve tested to date." He came to this revelation after figuring out that the 32-bit version of IE9 has the new Chakra JIT (the 64-bit version doesn't). It also has a DirectX-based rendering engine, so it can do cool tricks once reserved for desktop applications.
    - Windows 7 Desktop Integration. Read my post for more details. Unfortunately, they didn't integrate my ideas... at least not yet :)
    - Hot new UI. OK, they "borrowed" some ideas from Chrome... but that is the best form of flattery.
    - Standards Compliance. A real focus on HTML5 and CSS3. Definite goodness for developers.
    So, go get yourself some IE9 on Monday and enjoy!

    Read the article

  • Master Data Management – A Foundation for Big Data Analysis

    - by Manouj Tahiliani
    While Master Data Management has crossed the proverbial chasm and is on its way to becoming mainstream, businesses are being hammered by a new megatrend called Big Data. Big Data is characterized by massive volumes, high frequency, a variety of less structured data sources such as email, sensors, smart meters, social networks, and web logs, and the need to analyze vast amounts of data to determine value and improve management decisions. Businesses that have embraced MDM to get a single, enriched, and unified view of master data - by resolving semantic discrepancies and augmenting the explicit master data from within the enterprise with implicit data from outside the enterprise, like social profiles - will have a leg up in embracing Big Data solutions. This is especially true for large and medium-sized businesses in industries like retail, communications, and financial services, which would find it very challenging to get comprehensive analytical coverage and derive long-term success without resolving the limitations of a heterogeneous topology that leads to disparate, fragmented, and incomplete master data. For analytical success from Big Data - in other words, ROI from Big Data investments - businesses need to acquire, organize, and analyze the deluge of data to make better decisions. Structured and unstructured data will need to coexist, with a tight link maintained between the two to extract maximum insight. MDM is the catalyst that helps maintain that tight linkage by providing an understanding of the identity and characteristics of the persons, companies, products, suppliers, etc. associated with the Big Data, thereby helping accelerate ROI. In my next post I will discuss patterns for co-existing Big Data solutions and MDM. Feel free to provide comments and thoughts on the above, as well as on integration or architectural patterns.

    Read the article

  • Is committing/checking in code every day a good practice?

    - by ArtB
    I've been reading Martin Fowler's note on Continuous Integration and he lists as a must "Everyone Commits To the Mainline Every Day". I do not like to commit code unless the section I'm working on is complete, so in practice I commit my code every three days: one day to investigate/reproduce the task and make some preliminary changes, a second day to complete the changes, and a third day to write the tests and clean it up^ for submission. I would not feel comfortable submitting the code sooner. Now, I pull changes from the repository and integrate them locally usually twice a day, but I do not commit that often unless I can carve out a smaller piece of work. Question: is committing every day such a good practice that I should change my workflow to accommodate it, or is it not that advisable? Edit: I guess I should have clarified that I meant "commit" in the CVS meaning of it (aka "push"), since that is likely what Fowler meant in 2006 when he wrote this. ^ The order is somewhat arbitrary and depends on the task; my point was to illustrate the time span and activities, not the exact sequence.

    Read the article

  • Multiple displays using AMD drivers

    - by Halik
    I am currently running a dual display setup with nVidia 8800GTS video card, on a Ubuntu 12.10 box. The current setup uses nVidia TwinView to render the image on a 1920x1200 display and 1600x1200 one. I'm planning to add a third, 1280x1024 display to the setup. The change will require me to upgrade my GFX card to one supporting triple displays. I'll probably go with Sapphire Radeon 7770 (FLEX edition, to avoid additional active DP-DVI adapters). Before I invest in new GFX I wanted to ask - how well the AMD drivers will support such a setup. It does not matter whether it's fglrx or the OSS ones. If I remember correctly, when running Fedora on a Radeon x800, I had 'void' areas above and below the working area on my second display. The desktop was rendered in 1920+1280 width and 1200 height (which left 176px of vertical space accessible for my cursor and windows but not displayed on the screen - I'd prefer to avoid that). It may have very well been my misconfiguration back then. Generally, are there any solutions from AMD on par with TwinView? Or is it a non-issue at all? Also, I'm wondering about the usual stuff - hardware h264 decoding support, glitch-free flash support, any issues with Compiz/Unity?

    Read the article

  • Is it a must to focus on one specific IT subject to be successful?

    - by Ahmet Yildirim
    Lately I'm deeply disturbed by the thought that I'm still not devoted to one specific IT subject after so many years of doing it as a hobby. I've had so many different IT-related hobbies since I was 12. I have spent 8 years on them; now I'm 20 and have just finished my freshman year in Computer Engineering. Just to summarize the variety:
    - 3D game dev and modelling (Acknex, Irrlicht, OpenGL, GLES, 3DSMAX)
    - Mobile app dev (Symbian, Maemo, Android)
    - Electronics (Arduino)
    - Web dev (PHP, MySQL, JavaScript, jQuery, RaphaelJS, Canvas, Flash, etc.)
    - Computer vision (OpenCV)
    I need to start making money, but I'm having trouble picking the right IT business to do so. Is it a problem (in the business world) to have an interest in so many different IT subjects? I have a lot of fun doing all of this from time to time. Other than the money question, I've also noticed that having so many different interests lowers my productivity. But I'm still having difficulty picking one. I feel close to all of these subjects (from time to time).

    Read the article

  • Dealing with inflexible programmers.

    - by Singleton
    Sometimes programmers who work on a project for a long time become inflexible, and it becomes difficult to reason with them. Even if we do manage to convince them, they can be unlikely to implement our suggestions. For instance, I recently joined a project where the build and release process is too complicated and has unnecessary roadblocks. I suggested that we get rid of some of the development overhead (like filling in a few spreadsheets) just by integrating the defect management and version control tools (both are IBM Rational tools, so integration can be a very easy one-off effort). Also, if we use tools like Maven and Ant (the project involves Java and some COTS products), build and release can be simplified, which should reduce manual errors and intervention. I managed to convince others and I'm ready to put in the effort to develop a proof of concept. But the 'senior' developer is not willing, possibly because the current process makes him more valuable. How do we handle this situation without creating friction in the team?

    Read the article

  • Armchair CEO: Windows

    - by Scott Kuhl
    Originally posted on: http://geekswithblogs.net/scottkuhl/archive/2013/10/12/armchair-ceo-windows.aspx
    Welcome to part 3 of my Armchair CEO series where I prove just why I’m not running Microsoft. In this insightful edition I’ll tell you how to make Windows, the golden flagship of Microsoft, a better product.
    Android Apps: Windows Phone is not the only app store that needs a boost. But unlike Windows Phone, there is a very easy way to get a lot more apps on your Windows PC: BlueStacks. Right now BlueStacks has 3 things going against it: its UI integration is a desktop app hack, it does not work on RT, and no one knows about it. All three could be fixed if Microsoft bought the company or pulled off the same thing. The store can be designed to give preference to Windows Store apps, but it closes a lot of holes quickly.
    The Desktop Experience: Windows should switch between desktop mode and tablet mode automatically. Laptops without touch and desktops should work a lot more like Windows 7. The PC should boot to the desktop, and Metro apps should run in windows, like MetroMix. A tablet should boot to the Start Screen by default and pretty much work the same way it does now in 8.1. Touch laptops should give the user an in-your-face option on first boot to pick the experience. And finally, the experience can be changed automatically if the PC is docked or has external monitors hooked up.
    Death of the Desktop: This might seem completely opposite to the last feature, but it's not. I should have no need to ever see the desktop from Start Screen mode. Every setting needs to be available, an amazing port of the file explorer is needed, and Office Metro must be released. Desktop apps should also be able to run in full-screen mode like other Metro apps.

    Read the article

  • Notification framework for object lifecycle

    - by rlandster
    I am looking for an application, framework, or library that would help us with "object life-cycle management". There are many things that are created for users, departments, and services that, all too often, are left unmanaged. Some examples:
    - user accounts
    - groups
    - SSL certificates
    - access rights
    - databases
    - software license provisionings
    - storage
    - list-serve accounts
    These objects are created and managed by a wide variety of applications and systems. Typically, a user (person) requests (either explicitly or implicitly) one of these objects. A centralized management tool would help us manage such administration chores as:
    - What objects does user X currently own/manage?
    - Move the ownership of object P to user X; move all objects owned by user X (who has just been fired) to user Y.
    - For all objects of type T that have expired, be sure the objects have been disabled or deleted by their provider.
    - How many active (expired, about-to-expire) objects of type P are there?
    - Send periodic notifications to all users who own active objects of type P reminding them of what they own.
    - There is a security alert for objects of type P; send a notification to all users who own these types of objects to take a specific remedial action.
    - Delete or disable a set of objects based on expiration (or some other criteria).
    These objects are directly managed through their own applications (Active Directory, MySQL, file systems, etc.) and may even have their own notification systems, but I want to centralize this into an "object management system". The OMS should allow:
    - association with an external identity provider that defines who the users and groups are (e.g., LDAP, Active Directory)
    - creation of objects
    - association of an object with a specific user and/or group
    - association with an expiration date
    - creation of flexible reporting, including letting users know what objects they currently own and their expiration dates
    - integration with an external object "provider" via a plug-in
    We could write something from scratch, but I am hoping there is something already out there that will help, either an entire application or a set of libraries that provide much of what is needed. Any ideas?
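    As a rough illustration of the data model such an "object management system" implies, here is a minimal sketch in Java, under the assumption of a home-grown registry: each managed object records its type, owner, provider, and expiration date, which is enough to answer the ownership, reassignment, and expiration questions listed above. All class and method names are hypothetical.

    ```java
    import java.time.LocalDate;
    import java.util.ArrayList;
    import java.util.List;

    public class ObjectRegistrySketch {

        // One managed object: an account, certificate, database, etc.
        static class ManagedObject {
            final String id;
            final String type;      // e.g. "ssl-certificate", "database"
            String owner;           // user or group from the external identity provider
            final String provider;  // the system that actually hosts the object
            final LocalDate expires;

            ManagedObject(String id, String type, String owner, String provider, LocalDate expires) {
                this.id = id;
                this.type = type;
                this.owner = owner;
                this.provider = provider;
                this.expires = expires;
            }
        }

        static class Registry {
            private final List<ManagedObject> objects = new ArrayList<>();

            void register(ManagedObject o) { objects.add(o); }

            // "What objects does user X currently own/manage?"
            List<ManagedObject> ownedBy(String user) {
                List<ManagedObject> result = new ArrayList<>();
                for (ManagedObject o : objects) {
                    if (o.owner.equals(user)) result.add(o);
                }
                return result;
            }

            // "Move all objects owned by user X to user Y."
            void reassign(String from, String to) {
                for (ManagedObject o : objects) {
                    if (o.owner.equals(from)) o.owner = to;
                }
            }

            // "For all objects of type T that have expired..." - candidates to disable or delete.
            List<ManagedObject> expired(String type, LocalDate today) {
                List<ManagedObject> result = new ArrayList<>();
                for (ManagedObject o : objects) {
                    if (o.type.equals(type) && o.expires.isBefore(today)) result.add(o);
                }
                return result;
            }
        }

        public static void main(String[] args) {
            Registry registry = new Registry();
            registry.register(new ManagedObject("cert-42", "ssl-certificate", "alice",
                    "internal-ca", LocalDate.of(2014, 1, 1)));
            registry.reassign("alice", "bob");
            System.out.println(registry.ownedBy("bob").size() + " object(s) now owned by bob");
            System.out.println(registry.expired("ssl-certificate", LocalDate.of(2014, 6, 1)).size()
                    + " expired certificate(s)");
        }
    }
    ```

    An off-the-shelf tool would layer the notification scheduling and provider plug-ins on top of a model like this.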

    Read the article

  • Why is Ubuntu One slow to sync in 11.10, whether for backup or for any sub-folder contents?

    - by pst007x
    I have been trying to sync my Documents folder of 1.4GB; it still hasn't worked, and it has been syncing for a month. The top level syncs - files and folders in the Documents folder - but the contents of sub-folders just hang. (I gave up and stopped syncing this folder.) I have also tried using the backup facility in 11.10 to back up to Ubuntu One, and I upgraded my storage space in Ubuntu One. It has been going for roughly 24 hours now and has only backed up what looks like a couple of percent. (By the way, what an excellent idea to back up to Ubuntu One, if only we could get it to actually work! :-o) The odd thing is I can sync to Dropbox within hours, rather than months. This is bad, and has been an issue since Ubuntu One's release. I have reported this problem, and there were promises it would be fixed in later releases, but it hasn't been. Canonical cannot help either... I posted on several blogs; a lot of people have the same problem, but no fixes. So do I use Dropbox or another service until it is sorted? Since Ubuntu does not seem to see this as an issue, I think a fix will be a long time coming. (However, I love the potential of Ubuntu One and its integration with the OS.) Yes, my internet speeds are fine, etc. :-) No firewall (sudo ufw status: STATUS: INACTIVE), no proxy, etc. NB: I have raised this as a separate question from others posted here because my question relates to Ubuntu 11.10, though I have commented elsewhere for help. My question also relates to deja-dup backup to Ubuntu One. Thanks

    Read the article

  • Mount ext4-formatted external drive -> the drive's green light won't stop flashing

    - by Gohlool
    I've installed Kubuntu (after 10 years I am trying to play with Linux again) and managed to attach an external 1TB HDD over USB. The drive was formatted with NTFS and everything was working OK. I also changed /etc/fstab; here is my NTFS mount setting: /dev/sdb1 /media/samsung ntfs-3g auto,user,uid=1000,gid=1000,fmask=000,utf8 0 0. Now I've repartitioned the drive and formatted it with the ext4 filesystem, and changed my fstab line to: /dev/sdb1 /media/samsung ext4 defaults,noatime 0 0. Now, when I plug in my drive or call sudo mount -a, the external drive's green light starts to flash and won't stop, but the mount works... What is the problem? Is this because of ext4? With NTFS this didn't happen. By the way, after changing the owner of /media/samsung and setting permissions to 777, I can access the drive, e.g. create new folders (although it's flashing constantly). What is my mistake? Also, can you please let me know how to set the owner and the permissions for my /media/samsung directory in fstab for ext4, like I did for NTFS? Thanks in advance

    Read the article

  • Thunderbird keeps crashing Ubuntu 12.04 64 bit

    - by maurizio ribera d'alcala'
    I have been using Thunderbird for years under different versions of Ubuntu, including 12.04 since its release. For the past couple of days it keeps crashing a few seconds after starting. I tried to reinstall it, creating a new profile and copying the old mail, file by file. After one day of normal functioning it started to crash again. Mozilla is receiving the crash reports. The following is the content of the last one:
    Add-ons: [email protected]:15.0,[email protected]:0.3.11,[email protected]:0.9.3,[email protected]:3.4.1,[email protected]:15.0,[email protected]:0.5.1,{b4447f60-db9c-11da-a94d-0800200c9a66}:0.9.1,{972ce4c6-7e08-4474-a285-3208198ce6fd}:15.0
    BuildID: 20120827103657
    CrashTime: 1347200254
    EMCheckCompatibility: true
    Email: [email protected]
    FramePoisonBase: 7ffffffff0dea000
    FramePoisonSize: 4096
    InstallTime: 1346431480
    Notes: OpenGL: Tungsten Graphics, Inc -- Mesa DRI Intel(R) Sandybridge Mobile -- 3.0 Mesa 8.0.2 -- texture_from_pixmap
    ProductID: {3550f703-e582-4d05-9a08-453d09bdfdc6}
    ProductName: Thunderbird
    ReleaseChannel: release
    SecondsSinceLastCrash: 1109
    StartupTime: 1347200242
    Theme: classic/1.0
    Throttleable: 1
    Vendor:
    Version: 15.0
    I started Thunderbird in safe mode and tested all the add-ons. Apparently it is the 'Unity Launcher Integration' add-on that creates the problem. I say apparently because I want to wait two or three days to be sure TB has returned to its regular functioning. Is this a bug? Can it be solved?

    Read the article

  • Visual Studio 2010 Service Pack 1 Released

    - by krislankford
    The VS 2010 SP1 release was simultaneous with the release of TFS 2010 SP1 and includes support for the Project Server Integration Feature Pack and updates to .NET Framework 4.0. The complete Visual Studio SP1 list, including Test and Lab Manager: http://support.microsoft.com/kb/983509
    The release addresses some of the most requested features from customers of Visual Studio 2010, like:
    - better help support
    - IntelliTrace support for 64-bit and SharePoint
    - Silverlight 4 Tools in the box
    - unit testing support on .NET 3.5
    - a new performance wizard for Silverlight
    Another major addition is the announcement of Unlimited Load Testing for Visual Studio 2010 Ultimate with MSDN Subscribers! The benefits of the Visual Studio 2010 Load Test Feature Pack and useful links:
    - Improved Overall Software Quality through Early Lifecycle Performance Testing: Lets you stress test your application early and throughout its development lifecycle with realistically modeled simulated load. By integrating performance validations early into your applications, you can ensure that your solution copes with real-world demands and behaves in a predictable manner, effectively increasing overall software quality.
    - Higher Productivity and Reduced TCO with the Ability to Scale without Incremental Costs: Development teams no longer have to purchase Visual Studio Load Test Virtual User Pack 2010.
    - Download the Visual Studio 2010 Load Test Feature Pack Deployment Guide
    - Get started with stress and performance testing with Visual Studio 2010 Ultimate - Quality Solutions Best Practice: Enabling Performance and Stress Testing throughout the Application Lifecycle
    - Hands-On Lab: Introduction to Load Testing with ASP.NET Profile in Visual Studio 2010
    - How-Do-I videos: Use ASP.NET Profiler in Load Tests; Use Network Emulation in Load Tests
    - VHD/VPC walkthrough: Getting Started with Load and Performance Testing
    - Best Practice guidance: Visual Studio Performance Testing Quick Reference Guide

    Read the article

  • Project Jigsaw: Late for the train: The Q&A

    - by Mark Reinhold
    I recently proposed, to the Java community in general and to the SE 8 (JSR 337) Expert Group in particular, to defer Project Jigsaw from Java 8 to Java 9. I also proposed to aim explicitly for a regular two-year release cycle going forward. Herewith a summary of the key questions I’ve seen in reaction to these proposals, along with answers. Making the decision Q Has the Java SE 8 Expert Group decided whether to defer the addition of a module system and the modularization of the Platform to Java SE 9? A No, it has not yet decided. Q By when do you expect the EG to make this decision? A In the next month or so. Q How can I make sure my voice is heard? A The EG will consider all relevant input from the wider community. If you have a prominent blog, column, or other communication channel then there’s a good chance that we’ve already seen your opinion. If not, you’re welcome to send it to the Java SE 8 Comments List, which is the EG’s official feedback channel. Q What’s the overall tone of the feedback you’ve received? A The feedback has been about evenly divided as to whether Java 8 should be delayed for Jigsaw, Jigsaw should be deferred to Java 9, or some other, usually less-realistic, option should be taken. Project Jigsaw Q Why is Project Jigsaw taking so long? A Project Jigsaw started at Sun, way back in August 2008. Like many efforts during the final years of Sun, it was not well staffed. Jigsaw initially ran on a shoestring, with just a handful of mostly part-time engineers, so progress was slow. During the integration of Sun into Oracle all work on Jigsaw was halted for a time, but it was eventually resumed after a thorough consideration of the alternatives. Project Jigsaw was really only fully staffed about a year ago, around the time that Java 7 shipped. We’ve added a few more engineers to the team since then, but that can’t make up for the inadequate initial staffing and the time lost during the transition. Q So it’s really just a matter of staffing limitations and corporate-integration distractions? A Aside from these difficulties, the other main factor in the duration of the project is the sheer technical difficulty of modularizing the JDK. Q Why is modularizing the JDK so hard? A There are two main reasons. The first is that the JDK code base is deeply interconnected at both the API and the implementation levels, having been built over many years primarily in the style of a monolithic software system. We’ve spent considerable effort eliminating or at least simplifying as many API and implementation dependences as possible, so that both the Platform and its implementations can be presented as a coherent set of interdependent modules, but some particularly thorny cases remain. Q What’s the second reason? A We want to maintain as much compatibility with prior releases as possible, most especially for existing classpath-based applications but also, to the extent feasible, for applications composed of modules. Q Is modularizing the JDK even necessary? Can’t you just put it in one big module? A Modularizing the JDK, and more specifically modularizing the Java SE Platform, will enable standard yet flexible Java runtime configurations scaling from large servers down to small embedded devices. In the long term it will enable the convergence of Java SE with the higher-end Java ME Platforms. Q Is Project Jigsaw just about modularizing the JDK? A As originally conceived, Project Jigsaw was indeed focused primarily upon modularizing the JDK. 
The growing demand for a truly standard module system for the Java Platform, which could be used not just for the Platform itself but also for libraries and applications built on top of it, later motivated expanding the scope of the effort. Q As a developer, why should I care about Project Jigsaw? A The introduction of a modular Java Platform will, in the long term, fundamentally change the way that Java implementations, libraries, frameworks, tools, and applications are designed, built, and deployed. Q How much progress has Project Jigsaw made? A We’ve actually made a lot of progress. Much of the core functionality of the module system has been prototyped and works at both compile time and run time. We’ve extended the Java programming language with module declarations, worked out a structure for modular source trees and corresponding compiled-class trees, and implemented these features in javac. We’ve defined an efficient module-file format, extended the JVM to bootstrap a modular JRE, and designed and implemented a preliminary API. We’ve used the module system to make a good first cut at dividing the JDK and the Java SE API into a coherent set of modules. Among other things, we’re currently working to retrofit the java.util.ServiceLoader API to support modular services. Q I want to help! How can I get involved? A Check out the project page, read the draft requirements and design overview documents, download the latest prototype build, and play with it. You can tell us what you think, and follow the rest of our work in real time, on the jigsaw-dev list. The Java Platform Module System JSR Q What’s the relationship between Project Jigsaw and the eventual Java Platform Module System JSR? A At a high level, Project Jigsaw has two phases. In the first phase we’re exploring an approach to modularity that’s markedly different from that of existing Java modularity solutions. We’ve assumed that we can change the Java programming language, the virtual machine, and the APIs. Doing so enables a design which can strongly enforce module boundaries in all program phases, from compilation to deployment to execution. That, in turn, leads to better usability, diagnosability, security, and performance. The ultimate goal of the first phase is produce a working prototype which can inform the work of the Module-System JSR EG. Q What will happen in the second phase of Project Jigsaw? A The second phase will produce the reference implementation of the specification created by the Module-System JSR EG. The EG might ultimately choose an entirely different approach than the one we’re exploring now. If and when that happens then Project Jigsaw will change course as necessary, but either way I think that the end result will be better for having been informed by our current work. Maven & OSGi Q Why not just use Maven? A Maven is a software project management and comprehension tool. As such it can be seen as a kind of build-time module system but, by its nature, it does nothing to support modularity at run time. Q Why not just adopt OSGi? A OSGi is a rich dynamic component system which includes not just a module system but also a life-cycle model and a dynamic service registry. The latter two facilities are useful to some kinds of sophisticated applications, but I don’t think they’re of wide enough interest to be standardized as part of the Java SE Platform. Q Okay, then why not just adopt the module layer of OSGi? 
A The OSGi module layer is not operative at compile time; it only addresses modularity during packaging, deployment, and execution. As it stands, moreover, it’s useful for library and application modules but, since it’s built strictly on top of the Java SE Platform, it can’t be used to modularize the Platform itself. Q If Maven addresses modularity at build time, and the OSGi module layer addresses modularity during deployment and at run time, then why not just use the two together, as many developers already do? A The combination of Maven and OSGi is certainly very useful in practice today. These systems have, however, been built on top of the existing Java platform; they have not been able to change the platform itself. This means, among other things, that module boundaries are weakly enforced, if at all, which makes it difficult to diagnose configuration errors and impossible to run untrusted code securely. The prototype Jigsaw module system, by contrast, aims to define a platform-level solution which extends both the language and the JVM in order to enforce module boundaries strongly and uniformly in all program phases. Q If the EG chooses an approach like the one currently being taken in the Jigsaw prototype, will Maven and OSGi be made obsolete? A No, not at all! No matter what approach is taken, to ensure wide adoption it’s essential that the standard Java Platform Module System interact well with Maven. Applications that depend upon the sophisticated features of OSGi will no doubt continue to use OSGi, so it’s critical that implementations of OSGi be able to run on top of the Java module system and, if suitably modified, support OSGi bundles that depend upon Java modules. Ideas for how to do that are currently being explored in Project Penrose. Java 8 & Java 9 Q Without Jigsaw, won’t Java 8 be a pretty boring release? A No, far from it! It’s still slated to include the widely-anticipated Project Lambda (JSR 335), work on which has been going very well, along with the new Date/Time API (JSR 310), Type Annotations (JSR 308), and a set of smaller features already in progress. Q Won’t deferring Jigsaw to Java 9 delay the eventual convergence of the higher-end Java ME Platforms with Java SE? A It will slow that transition, but it will not stop it. To allow progress toward that convergence to be made with Java 8 I’ve suggested to the Java SE 8 EG that we consider specifying a small number of Profiles which would allow compact configurations of the SE Platform to be built and deployed. Q If Jigsaw is deferred to Java 9, would the Oracle engineers currently working on it be reassigned to other Java 8 features and then return to working on Jigsaw again after Java 8 ships? A No, these engineers would continue to work primarily on Jigsaw from now until Java 9 ships. Q Why not drop Lambda and finish Jigsaw instead? A Even if the engineers currently working on Lambda could instantly switch over to Jigsaw and immediately become productive—which of course they can’t—there are less than nine months remaining in the Java 8 schedule for work on major features. That’s just not enough time for the broad review, testing, and feedback which such a fundamental change to the Java Platform requires. Q Why not ship the module system in Java 8, and then modularize the platform in Java 9? A If we deliver a module system in one release but don’t use it to modularize the JDK until some later release then we run a big risk of getting something fundamentally wrong. 
If that happens then we’d have to fix it in the later release, and fixing fundamental design flaws after the fact almost always leads to a poor end result. Q Why not ship Jigsaw in an 8.5 release, less than two years after 8? Or why not just ship a new release every year, rather than every other year? A Many more developers work on the JDK today than a couple of years ago, both because Oracle has dramatically increased its own investment and because other organizations and individuals have joined the OpenJDK Community. Collectively we don’t, however, have the bandwidth required to ship and then provide long-term support for a big JDK release more frequently than about every other year. Q What’s the feedback been on the two-year release-cycle proposal? A For just about every comment that we should release more frequently, so that new features are available sooner, there’s been another asking for an even slower release cycle so that large teams of enterprise developers who ship mission-critical applications have a chance to migrate at a comfortable pace.
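    For readers who have not seen a module declaration, here is an illustrative sketch. The syntax shown is the module-info.java form that eventually shipped with the Java 9 module system, used here only as an example (the Jigsaw prototype discussed above used its own, earlier syntax), and the module and package names are made up.

    ```java
    // module-info.java -- illustrative only; module and package names are hypothetical.
    module com.example.orders {
        requires java.sql;               // an explicit dependence on a platform module
        exports com.example.orders.api;  // only this package is visible to other modules
    }
    ```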

    Read the article

  • Don't miss Virtual Developer Day - All about ADF next week

    - by Shay Shmeltzer
    In case you haven't heard, we are holding a free online virtual developer day next week - July 10th - that you should attend, even if you think you already know ADF. First, the registration link: http://bit.ly/fusiondev. While one of the tracks is aimed at developers who are relatively new to ADF - covering ADF Faces, ADF Controller, and a comparison of productivity with Forms and other tools - the two other tracks have great content on topics that you might not be familiar with even if you already work with ADF. This includes sessions about the upcoming ADF Mobile, the new ADF support in Eclipse, and information about application lifecycle management with ADF and JDeveloper, as well as sessions that will open your mind to the areas where ADF integrates with other Fusion Middleware solutions such as BI, WebCenter, and SOA. Most of the sessions are quite heavy on demos, and you'll get a chance to interact with the presenters and ask questions during the live event. You should register even if you can't attend the live event - that way you'll get an email pointing you to the recorded sessions for on-demand viewing. See you next week.

    Read the article

  • Oracle Solutions supporting ICAM deployments

    - by user12604761
    The ICAM architecture has become the predominant security architecture for government organizations.  A growing number of federal, state, and local organizations are in various stages of using Oracle ICAM solutions.  The relevance of ICAM has clearly extended beyond the Federal ICAM mandates to any government program that must enable standards based interoperability like health exchanges and public safety.  The state government endorsed version of ICAM was just released with the NASCIO SICAM Roadmap. ICAM solutions require an integrated security architecture.  The major new release in August of Oracle Identity Management 11gR2 focuses on a platform approach to identity management.  This makes it easier for government organizations to acquire and implement a comprehensive ICAM solution, rather than individual products.  The following analysts reports describe the value of the Oracle Solutions: According to The Aberdeen Group:  “Organizations can save up to 48% deploying a platform of  (identity management) solutions when compared to deploying point solutions” IDC Product Flash, July 2012:  “Oracle may have hit the home run grand slam in identity management recently with the announcement of Oracle Identity Management 11g R2." For additional information on the Oracle ICAM solutions, attend the Webcast on October 10, 2012:  ICAM Framework for Enabling Agile, Service Delivery. Visit the Oracle Secure Government Resource Center for information on enterprise security solutions that help government safeguard information, resources and networks.

    Read the article

  • Why should I use MSBuild instead of Visual Studio Solution files?

    - by Sid
    We're using TeamCity for continuous integration and it's building our releases via the solution file (.sln). I've used Makefiles in the past for various systems, but never MSBuild (which I've heard is sort of a Makefiles + XML mashup). I've seen many posts on how to use MSBuild directly instead of the solution files, but I don't see a very clear answer on why to do it. So, why should we bother migrating from solution files to an MSBuild 'makefile'? We do have a couple of releases that differ by a #define (featurized builds), but for the most part everything works. The bigger concern is that we'd then have to maintain two systems when adding projects/source code. UPDATE: Can folks shed light on the lifecycle and interplay of the following three components?
    - The Visual Studio .sln file
    - The many project-level .csproj files (which I understand are "sub" MSBuild scripts)
    - The custom MSBuild script
    Is it safe to say that the .sln and .csproj files are consumed/maintained as usual from within the Visual Studio IDE GUI, while the custom MSBuild script is hand-written and usually consumes the already existing individual .csproj files "as-is"? That's one way I can see to reduce overlap/duplication in maintenance... I would appreciate some light on this from other folks' operational experience.

    Read the article

  • Release: Oracle Java Development Kit 8, Update 20

    - by Tori Wieldt
    Java Development Kit 8, Update 20 (JDK 8u20) is now available. This latest release of the Java Platform continues to improve upon the significant advances made in the JDK 8 release with new features, security fixes, and performance optimizations. These include: new enterprise-focused administration features available in Oracle Java SE Advanced; products offering greater control of Java version compatibility; security updates; and a very useful new feature, the MSI-compatible installer.
    Download | Release Notes | Java SE 8 Documentation
    New tools, features, and enhancements highlighted in JDK 8 Update 20 are:
    - Advanced Management Console: The Java Advanced Management Console 1.0 (AMC) is available for use with the Oracle Java SE Advanced products. AMC employs the Deployment Rule Set (DRS) security feature, along with other functionality, to give system administrators greater and easier control in managing Java version compatibility and security updates for desktops within their enterprise, and for ISVs with Java-based applications and solutions.
    - MSI Enterprise JRE Installer: Available for Windows 64- and 32-bit systems in the Oracle Java SE Advanced products, the MSI-compatible installer enables system administrators to provide automated, consistent installation of the JRE across all desktops in the enterprise, free of user interaction requirements.
    - Performance: string de-duplication resulting in a reduced footprint, and improved support in the G1 garbage collector for long-running apps.
    - A new 'force' feature in DRS (Deployment Rule Set), which allows system administrators to specify the JRE with which an applet or Java Web Start application will run. This is useful for legacy applications, so end users don't need to approve security exceptions to run them.
    - Java Mission Control 5.4, with new ease-of-use enhancements and launcher integration with Eclipse 4.4
    - JavaFX on ARM
    - Nashorn performance improvement by persisting bytecode after initial compilation
    There's much more information to be found in the JDK 8u20 Release Notes.

    Read the article

  • Two Cloudy Observations from Oracle OpenWorld

    - by GeneEun
    Now that the dust has settled from another amazing Oracle OpenWorld, I wanted to reflect back on a couple of key observations I made during the event. First, it was pretty clear that Cloud was again a big deal at this year's conference. Yes, the Oracle Database 12c announcement was also huge, but for most it was hard to not notice that Oracle continues to be "all-in" with respect to cloud computing. Just to give you an idea of the emphasis on Cloud, there were over 300 Cloud-related sessions at this year's OpenWorld. If you caught some of the demo booths in the Oracle Red Lounge, then you saw some of the great platform, application, and social services that are now part of Oracle Cloud, as well as numerous demos of private cloud products that Oracle offers. Second, during Thomas Kurian's keynote presentation on Oracle Cloud, he announced the Preview Availability of a new service called Oracle Developer Cloud Service. This new platform service will provide developers with instant access to environments to better manage the application development lifecycle in the cloud. It provides development project teams access to favorite tools like Hudson, Git, Github, wikis, and tasks to help make innovation faster, more collaborative, and more effective. There's also integration with IDEs like Eclipse, NetBeans, and JDeveloper. If you're a developer, it's an awesome addition to Oracle Cloud's platform services! Want more details about Oracle Developer Cloud Service? Click here.

    Read the article

  • Most suited technology for browser games?

    - by Tingle
    I was thinking about making a 2D MMO which I would, in the long run, support on various platforms: desktop, Mac, browser, Android, and iOS. The server will be C++/Linux based, and the first client would run in the browser. I have done some research and found that WebGL and Flash 11 support hardware-accelerated rendering, and I saw some other options like plain HTML5 canvas painting. So my question is: which technology should I use for such a project? My main goal is that users have a hassle-free experience, getting whatever their hardware can give them through hardware acceleration, and the client should work on the most basic out-of-the-box PCs that any casual PC or Mac user has. Another criterion is that it should be developer friendly. I've messed with WebGL a bit, for example, and it would require writing an engine from scratch - which is acceptable but not preferred. Also, in the case of non-ActionScript options, which kind of language is most preferred in terms of speed and flexibility? I'm not too fond of JavaScript because of the garbage collector, but I have learned to work around it. Thank you for your time.

    Read the article

  • ArchBeat Link-o-Rama for October 23, 2013

    - by OTN ArchBeat
    - Virtual Dev Day: Oracle ADF Development - Web, Mobile, and Beyond: This free virtual event includes technical sessions that range from introductory to deep dive, covering Oracle ADF and Oracle ADF Mobile. Multiple tracks cover every interest and every level and include live online Q&A for answers to your technical questions. Register now! Americas: Tuesday, November 19, 9am-1pm PT / 12pm-4pm ET / 1pm-5pm BRT; APAC: Thursday, November 21, 10am–1:30pm IST (India) / 12:30pm–4pm SGT (Singapore) / 3:30pm–7pm AESDT; EMEA: Tuesday, November 26, 9am-1pm GMT / 1pm-5pm GST / 2:30pm-6:30pm IST
    - A Roadmap for SOA Development and Delivery | Mark Nelson: Do you know the way to S-O-A? Mark Nelson does. His latest blog post, part of an ongoing series, will help to keep you from getting lost along the way.
    - Updated ODI Statement of Direction | Robert Schweighardt: Heads up, Oracle Data Integrator fans! A new product statement of direction document is available, offering "an overview of the strategic product plans for Oracle’s data integration products for bulk data movement and transformation, specifically Oracle Data Integrator (ODI) and Oracle Warehouse Builder (OWB)."
    - Java-Powered Robot Named NAO Wows Crowds | Tori Wieldt: Java community manager Tori Wieldt interviews a robot and a human.
    - Nordic OTN Tour 2013 | Lonneke Dikmans: Oracle ACE Director Lonneke Dikmans checks in from the Stockholm leg of the Nordic OTN Tour for 2013, sponsored by the Danish Oracle User Group and featuring fellow ACE Directors Tim Hall and Sten Vesterli, plus local speakers at various stops. Lonneke's post includes the slides from three of the presentations.
    - Thought for the Day: "Some people approach every problem with an open mouth." — Adlai E. Stevenson, 23rd Vice President of the United States (October 23, 1835 – June 14, 1914). Source: brainyquote.com

    Read the article

  • How to approach scrum task burn down when tasks have multiple people's involvement?

    - by AgileMan
    In my company, a single task can never be completed by one individual: there is a separate person to QA and code review each task. This means each individual gives their own estimate, per task, of how much time their part will take. The problem is, how should I approach burn down? If I aggregate the hours together, assume the following estimate: 10 hrs dev time, 4 hrs QA, 4 hrs code review - task estimate = 18 hrs. At the end of each day I ask that the task be updated with "how much time is left until it is done". However, each person generally just thinks about their own part of it. Should they mark their own effort remaining and then ADD the other roles' estimates to that? How are you doing this? UPDATE: To help clarify a few things, at my organization each task within a story requires 3 people: someone to develop the task (and do unit tests, etc.), a QA specialist to review the task (they primarily do integration and regression tests), and a tech lead to do code review. I don't think there is a right way or a wrong way, but this is our way... and that won't be changing. We work as a team to complete even the smallest slice of a story whenever possible. You cannot actually test whether something works until it is dev complete, and you cannot review the quality of the code either... so the best you can do is split things up into small logical slices so that the bare minimum functionality can be tested and reviewed as early in the process as possible. My question to those who work this way is how to burn down a "task" when tasks are set up this way. Unless a task has its own sub-tasks (which JIRA doesn't allow)... I'm not sure of the best way to track "what's left" on a daily basis.
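    A minimal sketch of the aggregation approach the question describes, assuming each role simply reports its own remaining hours and the task's burn-down number is their sum; the numbers are taken from the example estimate above, and the names are hypothetical.

    ```java
    public class TaskBurndownSketch {

        // The task's "time left until it is done" is the sum of what each role still has left.
        static int remainingHours(int devLeft, int qaLeft, int reviewLeft) {
            return devLeft + qaLeft + reviewLeft;
        }

        public static void main(String[] args) {
            // Day 0: the original estimate, 10 + 4 + 4 = 18 hrs.
            System.out.println(remainingHours(10, 4, 4)); // 18
            // End of day 2: dev thinks 2 hrs are left; QA and code review have not started,
            // so their full estimates still count toward the task's remaining effort.
            System.out.println(remainingHours(2, 4, 4));  // 10
        }
    }
    ```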

    Read the article

  • SOA performance on SPARC T5 benchmark results

    - by JuergenKress
    The brand new, super fast SPARC T5 servers are available. The platform is superb for running large SOA Suite environments or for consolidating your whole middleware platform. Some performance notes, recommended for all workloads:
    - Performance profile for SOA apps on Oracle Solaris 11: BPEL (Fusion Order Demo) instances per second, OSB (messages / transformations per second)
    - Crypto acceleration study for SOA transformations
    - SPARC T4 and T5 platform testing, pre-tuning
    - Performance suitable for mid-to-high range enterprises in stand-alone SOA deployments or virtualized consolidation environments shared with Oracle applications
    - 2.2x to 5x faster than SPARC T3 servers
    - 25% faster SOA throughput, core to core, than Intel 5600-series servers (running Exalogic software)
    - SPARC T5 has 2x the consolidation density of Intel 5600-class processors
    - 2x faster initial deployment time using Optimized Solutions pre-tested configuration steps
    - Over 200 application adapters for easy Oracle software integration
    Would you like details? We can share T5 SOA Suite performance benchmarks with you on a 1:1 basis - please contact your local partner manager or myself!
    SOA & BPM Partner Community: For regular information on Oracle SOA Suite, become a member of the SOA & BPM Partner Community; for registration please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account, please contact the Oracle Partner Business Center. Blog Twitter LinkedIn Facebook Wiki Mix Forum
    Technorati Tags: T5, T5 Sparc, T5 SOA, benchmark, SOA Community, Oracle SOA, Oracle BPM, Community, OPN, Jürgen Kress

    Read the article
