Search Results

Search found 9764 results on 391 pages for 'cross join'.

Page 257/391 | < Previous Page | 253 254 255 256 257 258 259 260 261 262 263 264  | Next Page >

  • Use a SQL Database for a Desktop Game

    - by sharethis
    Developing a Game Engine I am planning a computer game and its engine. There will be a 3-dimensional world with a first-person view, and it will be single-player for now. The programming language is C++ and it uses OpenGL. Data Centered Design Decision My design decision is to use a data-centered architecture where there is a global event manager and a global data manager. There are many components like physics, input, sound, renderer, AI, ... Each component can trigger and listen to events. Moreover, each component can read, edit, create and remove data. The question is about the data manager. Whether to Use a Relational Database Should I use a SQL database, e.g. SQLite or MySQL, to store the game data? This contains virtually all game content like items, characters, inventories, ... except for meshes and textures, which are even more performance-critical, so I will keep them in memory. Is a SQL database fast enough for real-time reading and writing of game information, like the position of a moving character? I also need to care about cross-platform compatibility. Aside from keeping everything in memory, what alternatives do I have? Advantages Would Be The advantages of using a relational database like MySQL would be the data-oriented structure, which allows fast computation. I would not need objects for representing entities. I could easily query data for objects near the player, needed for rendering, and I wouldn't have to care about data for objects far away. Moreover, there would be no need for savegames, since the whole game state is saved in the database. Last but not least, expanding the game into an online game would be relatively easy, because there already is a place where the whole game state is stored.
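
    A rough sketch of the "query objects near the player" idea, using Python's built-in sqlite3 module rather than the C++ SQLite API the post itself would use; the entity table and its columns are invented for illustration, and whether per-frame queries like this are fast enough is exactly the open question above.

        import sqlite3

        # In-memory database standing in for the game's data store.
        conn = sqlite3.connect(":memory:")
        conn.execute("""
            CREATE TABLE entity (
                id   INTEGER PRIMARY KEY,
                kind TEXT NOT NULL,        -- 'character', 'item', ...
                x REAL NOT NULL, y REAL NOT NULL, z REAL NOT NULL
            )
        """)
        conn.executemany(
            "INSERT INTO entity (kind, x, y, z) VALUES (?, ?, ?, ?)",
            [("character", 1.0, 0.0, 2.0), ("item", 50.0, 0.0, 80.0)],
        )

        def entities_near(px, py, pz, radius):
            # Cheap bounding-box filter; an exact distance check could follow in code.
            r = radius
            return conn.execute(
                "SELECT id, kind, x, y, z FROM entity "
                "WHERE x BETWEEN ? AND ? AND y BETWEEN ? AND ? AND z BETWEEN ? AND ?",
                (px - r, px + r, py - r, py + r, pz - r, pz + r),
            ).fetchall()

        print(entities_near(0.0, 0.0, 0.0, 10.0))   # only the nearby character is returned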

    Read the article

  • Corporate Efficiency

    - by AndyScott
    Thoughts on streamlining the process of getting someone up to speed when they join a project as a new hire, or, as is common in some companies, switch from one project to another: Has anyone heard of a strategy (including an emphasis on consistent, ongoing documentation) that would bring a user up to speed quickly? Has there been any thought given to focused documentation, specific to a role within a project? Or formalized mentoring within a project that goes beyond a “system walkthrough”? The time wasted when a senior-level worker is brought on board is often overlooked. It's assumed that they will know the right questions to ask. They are the type of people who normally learn quickly, and in their own ways, so let them get by with what's out there. Having a user without a computer will cost you measurable worker hours, making it an easy target to shoot at (and rightly so). Not getting them up to speed as quickly as possible is an efficiency issue too, but it seems to have become an accepted loss across the industry. Given the complexity of the projects within most companies, and the frequency with which users are shifted from one project to another based on need, I think this is an area that bears consideration.

    Read the article

  • More Tables or More Databases?

    - by BuckWoody
    I got an e-mail from someone with an interesting situation. He has 15,000 customers, and he asks whether he should have a separate database per customer for their data. Without a LOT more data it’s impossible to say, of course, but there are some general concepts to keep in mind. Whenever you’re segmenting data, it’s all about boundary choices. You have not only boundaries around how big the data will get, but things like how many objects (tables, stored procedures and so on) will be involved, whether there are any cross-sections of data (do they share location or product information?) and – very important – what the security requirements are. From the answers to these questions, you have the choice of making multiple tables in a single database, or using multiple databases. A database carries some overhead – it needs a certain amount of memory for locking and so on. But it has a very clean boundary – everything from objects to security can be kept apart. Having multiple users in the same database is possible as well, using things like a Schema. But keeping 15,000 schemas can be challenging, too. My recommendation in complex situations like this is similar to a post on decisions that I did earlier – I lay out the choices on a spreadsheet in rows, and then my requirements at the top in the columns. I give each choice a number based on how well it meets each requirement. At the end, the highest number wins. And many times it’s a mix – perhaps this person could segment customers into larger regions or districts or products in a database, and within that database there might be multiple schemas for the customers. Of course, if he needs to query across all customers, that becomes another requirement.
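
    The spreadsheet scoring Buck describes is easy to mock up; below is a tiny, hypothetical sketch in Python, with the options, requirements, weights and scores invented purely for illustration.

        # Hypothetical decision matrix: options in rows, requirements in columns.
        requirements = ["security isolation", "admin overhead", "cross-customer queries"]
        weights      = [3, 2, 2]   # how much each requirement matters

        scores = {
            "one database, many schemas":   [2, 3, 3],
            "one database per customer":    [3, 1, 1],
            "regional databases + schemas": [3, 2, 2],
        }

        def total(option):
            # Weighted sum across all requirements for one option.
            return sum(w * s for w, s in zip(weights, scores[option]))

        for option in scores:
            print(option, "->", total(option))
        print("highest score:", max(scores, key=total))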

    Read the article

  • Oracle College Rehire Program -China by Camilla!!

    - by Nadiya
    In China, for the R&D campus hires, we launched an Oracle College Hire Program that all the new graduates in the R&D team could join. The purpose is to let them understand Oracle’s culture and values, help them quickly become familiar and productive in their new work, provide meaningful events, and keep them engaged. They’re divided into classes by location; each class has around 20 people and a monitor, who is in charge of the whole class’s activities. The program has 3 modules: social activity, Speaker Series and Career Development. The pictures show one class having its social activity session. Exciting, isn't it?

    Read the article

  • 11/28 Thought Leaders Webinar: Marketing Strategies for Great Customer Experiences

    - by Charles Knapp
    With the growing use of mobile and social, it's tempting to bolt on these new channels to existing processes. However, that piecemeal approach may not lead to satisfying customer experiences or solid returns on investments. Furthermore, the volume of information businesses have access to is growing exponentially. Is this leading to better business insight and customer experiences? Join the Internet Marketing Association, The University of California at Irvine, and Oracle as we discuss marketing strategies that will help your customers have better experiences with your brand. You'll learn effective strategies for harnessing the power of "big data" to know more and understand your customers better, empowering customers and employees to make every interaction easy and rewarding, and adapting the customer experience to connect and engage effectively with each customer. Our speakers are Melissa Boxer, Vice President of Product Strategy, Oracle Cloud and CX Applications, who is a conference keynote speaker on integrated social marketing and loyalty analytics, and Dean Abbott, CEO of Abbott Analytics, who is a thought leader in commercial predictive analytics. This learning opportunity takes place on Wednesday, November 28, 11 am to 12 pm Pacific. Register today to learn from these thought leaders.

    Read the article

  • Choosing a JavaScript Asynch-Loader

    - by Prisoner ZERO
    I’ve been looking at various asynchronous resource-loaders and I’m not sure which one to use yet. Where I work we have disparate group efforts whose class modules may use different versions of jQuery (etc.), so nested dependencies may differ as well. I have no control over this, which means I need to dynamically load resources that may use alternate versions of the same library. Here are my requirements: Load JavaScript and CSS resource files asynchronously. Manage dependency order and nested dependencies across versions. Detect if a resource is already loaded. Must allow for cross-domain loading (CDNs). (Optional) Allow us to unload a resource. I’ve been looking at: Curl, RequireJS, JavaScriptMVC, LABjs. I might be able to fake these requirements myself by loading versions into properly-namespaced variables and using an array to track what is already loaded... but (hopefully) someone has already invented this. So my questions are: Which ones do you use, and why? Are there others that may satisfy my requirements fully? Which do you find most elegant and easiest to work with, and why?
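
    The "track what's already loaded and resolve dependency order" part of faking it yourself is essentially a dependency-graph walk; here is a language-neutral sketch of that bookkeeping in Python (the resource names and dependency table are invented, and the actual fetching/injection is left as a stub).

        # Hypothetical dependency table: resource -> resources it needs first.
        deps = {
            "app.js":        ["jquery-1.7.js", "widgets.css"],
            "widgets.css":   [],
            "jquery-1.7.js": [],
        }

        loaded = set()   # tracks what has already been loaded

        def fetch(resource):
            # Stub: a real loader would inject a <script> or <link> tag here.
            print("loading", resource)

        def load(resource):
            if resource in loaded:
                return          # already loaded, skip
            for dep in deps.get(resource, []):
                load(dep)       # load nested dependencies first
            fetch(resource)
            loaded.add(resource)

        load("app.js")   # loads jquery-1.7.js, widgets.css, then app.js
        load("app.js")   # no-op the second time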

    Read the article

  • How can I get non-programmer colleagues on board with bespoke software rather than Dynamics CRM + Sharepoint?

    - by Bendos
    I am working with a company which designs and builds one-off machines. They have been 'dabbling' with hosted Dynamics CRM and SharePoint (on different servers!) in an attempt to centralise their data and help colleagues collaborate more effectively across projects. They haven't used either system to its potential. Now we are looking at the engineering department, who already use a form of version control software for the various CAD files (Autodesk Vault); however, it is becoming increasingly necessary to implement a more generic file version control system, as they use many more files than can be managed in Vault (sometimes just photos or scans of paper documents), which is why they were looking at using SharePoint. However... as the 'programmer' of the bunch, I can see several scenarios which don't seem to fit well with the Dynamics + SharePoint approach: simple reports based on cross-table queries, exporting certain metrics as a spreadsheet, and defining project hierarchies and many-to-many relationships. As such, I have been pushing for an in-house-developed 'ECM' / 'ERP' software package (perhaps in .NET or PHP). Some colleagues seem to attach a greater value to the MS software (perhaps because it has a logo!) but don't see that it's just a framework, not a solution. Can anyone provide a good example of when custom software would actually be better than using Dynamics + SharePoint, and how do I relate that to non-technical staff?

    Read the article

  • Oracle Enterprise Manager 12c Testing-as-a-Service Solution

    - by user810030
    With organizations spending as much as 50 percent of their QA time on non-test-related activities like setting up hardware and deploying applications and test tools, the cloud will bring obvious benefits. A key component of Oracle Enterprise Manager, our current Application Quality Management products have been helping our customers with application load testing, functional testing and test process management, as well as test data management, data masking and real application testing. These products enable customers to thoroughly test applications and their underlying infrastructure to help ensure the best quality, scalability and availability prior to deployment. Today, Oracle announced the Oracle Enterprise Manager 12c Testing-as-a-Service Solution. This solution will allow users to significantly decrease the time needed to set up a complete test environment, while enhancing testing efficiency. Please read the press release mentioned above and join us in our Enterprise Manager LinkedIn Group discussion on this topic (you need to be a member), or visit our booth this week during the EuroSTAR Software Testing conference in Amsterdam, where we can demo this solution. I hope you find this helpful. Stay Connected: Twitter | Facebook | YouTube | Linkedin | Newsletter

    Read the article

  • Take a Tour of the Future

    - by Tom Caldecott-Oracle
    Visit Our HQ Usability Lab During Oracle OpenWorld 2014 Want to look behind the scenes at the Oracle Applications User Experience Usability Lab on the campus of our headquarters? No problem. You’re invited to join an exclusive tour. When? Thursday, October 2 or Friday, October 3. Where? Redwood Shores, Calif.  And what will you see on the tour? The future—how we test future product designs and the advanced technology we use to do that. You’ll also view early demos of upcoming enterprise software designs for tablets and mobile phones.  We’ll provide round-trip transportation, with the pickup and drop-off point being the InterContinental San Francisco.  Space is limited, so reserve your spot now. Want to know more about the tour and other Oracle Applications User Experience activities at Oracle OpenWorld? Visit UsableApps. Welcome to the future.

    Read the article

  • Release Notes for 6/14/2012

    Here are the notes for this week’s release: Diffs in Pull Requests and Commits We altered the way we display diffs across commits and pull requests to maximize the amount of vertical real estate devoted to the diff. Before, the viewport for diffs was always snapped to the height of the browser, which meant that on lower resolutions, the amount of space for viewing diffs could become very tiny. Now, the majority of the browser vertical space is devoted to viewing the diffs. Let us know what you think! Bug Fixes Fixed an issue where returning to the list of files changed from a diff would sometimes not show the list of files. Fixed the dialogs for approving and denying requests to join projects. Fixed various issues around validation of project details when publishing a project. Fixed an issue that caused the formatting of our tabs in pull requests to not display properly. Fixed an issue where users browsing Unicode files in a Git project would see error pages. Fixed various issues where the option to subscribe to notifications would not appear properly. Have ideas on how to improve CodePlex? Visit our ideas page! Vote for your favorite ideas or submit a new one. Got Twitter? Follow us and keep apprised of the latest releases and service status at @codeplex.

    Read the article

  • On installing nvidia drivers on 12.10 I get "Bad return status for module build on kernel: 3.5.0-19-generic (x86_64)"

    - by james
    New Ubuntu user - just recently made the mistake of trying a different nvidia driver. I'd managed to get the last one (nvidia-current) working through software sources a few weeks ago. The other day I tried to cross over to nvidia-experimental-310 and this produced a system error. Swapping back and forth between proprietary drivers now always causes an error and I can't get any of them to work. Installing through the terminal I get this error message every time:

        Building initial module for 3.5.0-19-generic
        Error! Bad return status for module build on kernel: 3.5.0-19-generic (x86_64)
        Consult /var/lib/dkms/nvidia-experimental-310/310.14/build/make.log for more information

    On rebooting, I end up with a crappy screen resolution and a thick black border around the screen. I use gksudo software-properties-gtk to bring up sources, where I can change back to the nouveau driver, which restores my screen. After that I can't find /var/lib/dkms/nvidia-experimental-310/310.14/build/make.log, so I can't tell you what's inside. Any ideas what might be preventing the nvidia driver from installing? SOLUTION FOUND Okay - so I have a workaround. This is what has worked: upgrade to kernel 3.7.0 as detailed here, then upgrade to the latest version of the nvidia drivers as detailed here. No idea what was happening with kernel 3.5.0-19, but this seems to be better. A little slower maybe on boot, but after days of messing around it's nice to have something that works.

    Read the article

  • Announcing Hackathon for Social Developers

    - by Mike Stiles
    Continuing our Social Developer theme, we're excited to announce a week-long hackathon put on by the Oracle Social Developer Lab (OSDL). The event starts at JavaOne Oct 2nd and runs through Oct 9th. A winner will be announced and profiled in the following issue of Java Magazine. What's it about? The OSDL is on a mission to make social development easier for the Java community. You may have noticed the biggest social networks have created tools for Ruby, PHP, and other languages, but not as much for Java. We've decided to help fill the gap with a SocialLink social publishing library. You can learn more about it on Java.net. We're also interested in promoting other tools that facilitate social development, such as the DaliCore Framework. For our hack, you've got one week to leverage our library and/or DaliCore to create a social app. The only rules are that it must be a new application, and it must leverage one or both of these tools. How to submit: create a project that uses either the SocialLink library or the DaliCore Framework to read or publish social data. 1. Upload your hack to a new project on java.net 2. Submit the URL to your java.net project through the project submission form on the Oracle Social Developer Community Facebook page. Only projects that have been submitted to the Oracle Social Developer Community will be reviewed. In addition to the review process, we'll be adding some projects to the SocialLink project as "sibling" projects. Should you participate? If you're a developer who aspires to integrate some social functionality into your Java application, then yes! How else can I participate with the OSDL? If you're not ready to participate in the hackathon but have ideas for how we can make social development easier for the Java community, come join our social developer community on Facebook.

    Read the article

  • Developing an Interface to a Dynamic System

    - by radix07
    I work for a small company and have been designing a GUI to interface with our embedded system. The problem with this embedded system is that it is not a finished product (and may never be): it is constantly under development and being tweaked and updated for different customers and applications in small volumes. So to deal with this, I made a program that can export all the data from the spreadsheet where most of the embedded-system variables are sourced, and throw it into a small database for the GUI application to use. This database program also spits out a cross-reference file for the embedded system, which allows the GUI to look up all the variables. This system works pretty well so far, and is even integrated with version control among the GUI, database, and embedded system. The big problem is that there is constant development on several projects that use this system, and it gets terribly tedious to keep the system up to date and bring in new changes. This has gotten to the point where I have had to code the GUI to dynamically (generically) generate all interfaces, since I am never guaranteed to find the same data the same way. I have not been able to come up with a good way to uniquely identify the data I import from Excel, since all fields can be changed (due to engineering stubbornness, code refactoring and/or Excel issues) and I cannot assign a fixed reference within the sheet itself. So, are there any good methods or ideas on how to handle the chaos?
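
    A minimal sketch of the export-and-cross-reference plumbing described above, assuming the spreadsheet has been saved as CSV with invented column names (name, address, type); it deliberately sidesteps the real difficulty, which is that none of those fields is guaranteed to stay stable.

        import csv, sqlite3

        conn = sqlite3.connect("gui_vars.db")
        conn.execute("""
            CREATE TABLE IF NOT EXISTS variable (
                name    TEXT PRIMARY KEY,   -- assumed unique, which is the post's real problem
                address INTEGER,
                type    TEXT
            )
        """)

        # Load the exported spreadsheet into the GUI's small database.
        with open("embedded_vars.csv", newline="") as f:
            for row in csv.DictReader(f):
                conn.execute(
                    "INSERT OR REPLACE INTO variable (name, address, type) VALUES (?, ?, ?)",
                    (row["name"], int(row["address"], 0), row["type"]),
                )
        conn.commit()

        # Emit a cross-reference file for the embedded side: one line per variable.
        with open("crossref.txt", "w") as out:
            for name, address, vtype in conn.execute(
                    "SELECT name, address, type FROM variable ORDER BY address"):
                out.write(f"{name} 0x{address:X} {vtype}\n")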

    Read the article

  • Spend Analytics on a Grand Scale

    - by jacqueline.coolidge(at)oracle.com
    The Wall St. Journal reports in Billions in Bloat Uncovered in Beltway that the Government Accountability Office (GAO) has released a massive study of several programs and agencies that cost U.S. taxpayers billions of dollars each year.  This report could help save $100 to $200 billion by identifying duplicate spending and ineffective programs that can be consolidated or eliminated. Now, that is spend analytics on a massive scale! It remains to be seen how actionable that information will be.  Certainly, there have been studies before that identify wasteful spending.  But it’s a great case of the power of business intelligence and spend analytics.   Many companies do find significant savings when they implement spend and procurement analytics. What makes for an excellent spend analysis? It should be: objective, providing visibility across programs and/or divisions; cross-functional, linking financial with performance metrics; and prescriptive and actionable. Spend and procurement analytics have been HOT during the economic downturn! I expect 2011 will see many more companies get serious about spend analytics, and I would love to hear from companies who are willing to share their experience.

    Read the article

  • Quiz Master at Beyond Relational

    - by Vincent Maverick Durano
    Last month a friend of mine invited me to join BeyondRelational.com and asked me to nominate myself as a .NET Quiz Master. In order to qualify, I had to submit an interesting question related to .NET, and their .NET team would review the information and select 31 quiz masters for the .NET quiz category. This seemed interesting to me, so I went ahead and submitted one entry. Luckily I was selected as one of the 31 Quiz Masters in the .NET category. I hope to be able to keep up the good work there for years to come. Big thanks to Jacob Sebastian and his team! And oh... I didn't get a chance to blog about this last week, but just to let you guys know, the .NET General Quiz started on January 1st, 2011. The quiz will be a series of 31 questions, managed by 31 .NET quiz masters. Each quiz master will ask one question, moderate the discussion and answers, and finally identify the winner of that quiz. Each correct answer will get a certain score ranging from 1 to 10, where 10 is the highest. The scores of all 31 questions will be added up to identify the final winner. So what are you waiting for? Sign up and register now and get a chance to win some exciting prizes! Technorati Tags: Community

    Read the article

  • MySQL documentation writer wanted

    - by stefanhinz
    As MySQL is thriving and growing, we're looking for an experienced technical writer located in Europe or North America to join the MySQL documentation team.For this job, we need the best and most dedicated people around. You will be part of a geographically distributed documentation team responsible for the technical documentation of all MySQL products. Team members are expected to work independently, requiring discipline and excellent time-management skills as well as the technical facilities to communicate across the Internet.Candidates should be prepared to work intensively with our engineers and support personnel. The overall team is highly distributed across different geographies and time zones. Our source format is DocBook XML. We're not just writing documentation, but also handling publication. This means you should be familiar with DocBook, and willing to learn our publication infrastructure.Candidates should therefore be interested not just in writing but also in the technical aspects of publishing documentation. Regarding your initial areas of authoring, those would be MySQL Cluster, MySQL Enterprise Monitor and Backup, and various parts of the MySQL server documentation (also known as the MySQL Reference Manual). This means you should be familiar with MySQL in general, and preferably also with MySQL Cluster and the MySQL Enterprise offerings.Other qualifications: Native English speaker 3 or more years previous experience in writing software documentation Excellent written and oral communication skills Ability to provide (online) samples of your work, e.g. books or articles Curiosity to learn new technologies Familiarity with distributed working environments and versioning systems such as SVN Comfortable with working on multiple operating systems, particularly Windows, Mac OS X, and Linux Ability to administer own workstations and test environment If you're interested, contact me under [email protected].

    Read the article

  • Oracle Weblogic 12c for New Projects–Webcast November 7th 2013

    - by JuergenKress
    Fast-growing organizations need to stay agile in the face of changing customer, business or market requirements. Oracle WebLogic Server 12c is the industry's best application server platform, allowing you to quickly develop and deploy reliable, secure, scalable and manageable enterprise Java EE applications. WebLogic Server Java EE applications are based on standardized, modular components. WebLogic Server provides a complete set of services for those modules and handles many details of application behavior automatically, without requiring programming. New project applications are created by Java programmers, Web designers, and application assemblers. Programmers and designers create modules that implement the business and presentation logic for the application. Application assemblers assemble the modules into applications that are ready to deploy on WebLogic Server. Build and run high-performance enterprise applications and services with Oracle WebLogic Server 12c, available in three editions to meet the needs of traditional and cloud IT environments. Join us in this webcast as we show you how WebLogic Server 12c helps you build and deploy enterprise Java EE applications, with new features for lowering the cost of operations, improving performance and enhancing scalability. Agenda: Oracle WebLogic Server Introduction; Application Development on WebLogic Using Java EE; Overview of the Application Deployment Process; Monitoring Application Performance; Q&A. November 07th, 2013, 9am UTC/11am EET. REGISTER NOW. WebLogic Partner Community: for regular information, become a member of the WebLogic Partner Community; please visit: http://www.oracle.com/partners/goto/wls-emea (OPN account required). If you need support with your account please contact the Oracle Partner Business Center. Blog Twitter LinkedIn Mix Forum Wiki Technorati Tags: education,WebLogic,WebLogic Community,Oracle,OPN,Jürgen Kress

    Read the article

  • ASP.NET MVC Cookbook - public review

    - by asiemer
    I have recently started writing another book.  The topic of this book is ASP.NET MVC.  This book differs from my previous book in that rather than working towards building one project from end to end, this book will demonstrate specific topics from end to end.  It is a recipe book (hence the cookbook name) and will be part of the Packt Publishing cookbook series.  An example recipe in this book might be how to consume JSON, create a master/details page, build jQuery modal popups, write custom ActionResults, etc.  Basically anything recipe-oriented around the topic of ASP.NET MVC might be acceptable.  If you are interested in helping out with the review process, you can join the "ASP.NET MVC 2 Cookbook-review" group on Google here: http://groups.google.com/group/aspnet-mvc-2-cookbook-review Currently the suggested TOC for the project is listed.  Also, chapters 1, 2, and most of 8 are posted.  Chapter 5 should be available tonight or tomorrow. In addition to reporting any errors that you might find (much appreciated), I am very interested in hearing about recipes that you want included, expanded, or removed (as being redundant or overly simple).  Any input is appreciated!  Hearing user feedback after the book is complete is a little late, in my opinion (unless it is positive feedback, of course). Thank you!

    Read the article

  • How do I configure an Intel HD Graphics 4000?

    - by derabbink
    First off, please note that last night I already posted this question to a Launchpad mailing list, so this could be considered a cross post. However, I think this is a better place to ask it. The question: how can I configure my Ubuntu 12.04, with an upgraded kernel (3.6), to use the Intel HD Graphics 4000 adapter? (The Intel HD 4000 is the standard graphics adapter of 3rd-generation Intel Core i7 (Ivy Bridge) processors.) Some output:

        $ glxinfo
        name of display: :0
        X Error of failed request:  BadRequest (invalid request code or no such operation)
          Major opcode of failed request:  154 (GLX)
          Minor opcode of failed request:  19 (X_GLXQueryServerString)
          Serial number of failed request:  12
          Current serial number in output stream:  12

        $ cat /etc/X11/xorg.conf    (this is probably the farthest from what it should be)
        Section "Screen"
            Identifier "Default Screen"
            DefaultDepth 24
        EndSection
        Section "Module"
            Load "glx"
        EndSection

        $ lspci
        00:02.0 VGA compatible controller: Intel Corporation Ivy Bridge Graphics Controller (rev 09)
        00:1b.0 Audio device: Intel Corporation Panther Point High Definition Audio Controller (rev 04)
        16:00.0 VGA compatible controller: Advanced Micro Devices [AMD] nee ATI Whistler XT [AMD Radeon HD 6700M Series]
        16:00.1 Audio device: Advanced Micro Devices [AMD] nee ATI Turks HDMI Audio [Radeon HD 6000 Series]

    From lspci I only listed the lines I think are relevant. If you want more info in order to help me, please comment :)

    Read the article

  • JavaOne Gangnam Style

    - by Tori Wieldt
    Yes, JavaOne is *the* place for excellent content, including technical information, opportunities to learn best practices from your peers, and access to industry experts. You can find lots of information about content in Java Evangelist Arun Gupta's 25 Reasons to attend JavaOne 2012. But you also have to let your Gangnam Style loose. Here are the Top Ten Fun Reasons to attend JavaOne 2012: 10. Connect with developers from more than 80 countries 9. Kick off the week at GlassFish and Friends Party Sunday night 8. Meet the community of Java Rock Stars 7. Enjoy all San Francisco has to offer 6. Meet your next best friend playing pinball in the Game Zone 5. Have your picture taken with Duke 4. Java in the morning and brews in the afternoon at the Taylor Street Cafe 3. Ride across the Golden Gate Bridge at the Community Geek Bike Ride 2. Rock out at the first-ever Oracle OpenWorld Music Festival and #1... 1. It beats being at work!  If you haven't registered, there's still time. Join us!

    Read the article

  • Handling commands or events that wait for an action to be completed afterwards

    - by virulent
    Say you have two events: Action1 and Action2. When you receive Action1, you want to store some arbitrary data to be used the next time Action2 rolls around. Usually Action1 is a command, but it can also be another event; the idea is still the same. The current way I am implementing this is by storing state and then, when Action2 is called, simply checking whether that specific state is there. This is obviously a bit messy and leads to a lot of redundant code. Here is an example of how I am doing that, in pseudocode form (and broken down quite a bit, obviously):

        void onAction1(event) {
            Player = event.getPlayer()
            Player.addState("my_action_to_do")
        }

        void onAction2(event) {
            Player = event.getPlayer()
            if not Player.hasState("my_action_to_do") { return }
            // Do something
        }

    When doing this for a lot of other actions it gets somewhat ugly, and I wanted to know if there is something I can do to improve upon it. I was thinking of something like this, which wouldn't require passing data around, but is this also not the right direction?

        void onAction1(event) {
            Player = event.getPlayer()
            Player.onAction2(new Runnable() {
                public void run() {
                    // Do something
                }
            })
        }

    If one wanted to take it even further, could you not simply do this?

        void onPlayerEnter(event) {        // When they join the server
            Player = event.getPlayer()
            Player.onAction1(new Runnable() {
                public void run() {
                    // Now wait for action 2
                    Player.onAction2(new Runnable() {
                        // Do something
                    })
                }
            }, true)   // TRUE would be to repeat the event,
                       // not remove it after it is called.
        }

    Any input would be wonderful.
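
    One way to make the callback idea concrete without the nesting is a per-player registry of pending callbacks keyed by the awaited event. A hypothetical Python sketch (the names are invented and not tied to any particular game API):

        from collections import defaultdict

        class Player:
            def __init__(self, name):
                self.name = name
                # event name -> list of (callback, repeat) entries
                self._pending = defaultdict(list)

            def on_next(self, event_name, callback, repeat=False):
                # Register something to run the next time event_name fires.
                self._pending[event_name].append((callback, repeat))

            def fire(self, event_name, *args):
                entries = self._pending.pop(event_name, [])
                keep = []
                for callback, repeat in entries:
                    callback(*args)
                    if repeat:
                        keep.append((callback, repeat))
                if keep:
                    self._pending[event_name].extend(keep)

        p = Player("alice")
        p.on_next("action2", lambda: print("do the thing saved from action1"))
        p.fire("action2")   # runs the callback once
        p.fire("action2")   # nothing left to run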

    Read the article

  • Oracle OpenWorld Recap - A Walk in the Clouds (and heat in San Francisco)!

    - by Di Seghposs
    Whether you were one of the 50,000 attendees in San Francisco or one of the million+ online attendees – we’d like to thank you for joining us at Oracle OpenWorld last week! With temperatures in the 80s and 90s, attendees traveled the overheated streets to join packed keynotes and general sessions – all to find the information they came in search of: Oracle solutions to address their business requirements and challenges. The buzz of this year’s OpenWorld was all about ‘The Cloud’. And the financial management team joined in the cloud buzz with Thomas Kurian’s keynote, which highlighted our ERP Cloud Service as the most complete cloud service on the market. Offering the full breadth of business operations, including Financial Management, Risk and Control Management, Project Portfolio Management, Procurement, Sourcing, and Inventory Management, Oracle ERP Cloud Service transforms the back office into a collaborative, efficient, and intuitive hub. And our product marketing expert on Financial Management, Annette Melatti, provided a glimpse of what the office of finance looks like in the 21st century and shared what’s next for Oracle’s financial solutions, discussing the future of Financial Management with the Fusion Financials, E-Business Suite, PeopleSoft and JD Edwards solutions. There were over 120 sessions from customers, partners, and Oracle experts that addressed financial management solutions, along with demo pods and Meet the Experts sessions. We hope you found what you were looking for! Missed any of the keynotes or general sessions? Watch them on demand here. At OpenWorld, we also announced that Lending Club, the leading platform for investing in and obtaining personal loans, has selected Oracle ERP Cloud Service to help improve decision-making, implement robust reporting, and take advantage of the cost savings provided by the cloud. The CFO of Lending Club, Carrie Dolan, mentioned that they “are an innovative, data-intensive, high-growth company and needed a solution and partner that could match us. We conducted a thorough review of our options, and Oracle ERP Cloud Service was the clear winner in terms of capabilities and business value as well as commitment to us as a customer.” Read the entire release here. For now, it’s back to business as we gear up for the second half of our fiscal year and start planning for Oracle OpenWorld 2013!

    Read the article

  • How do web servers enforce the same-origin policy?

    - by BBnyc
    I'm diving deeper into developing RESTful APIs and have so far worked with a few different frameworks to achieve this. Of course I've run into the same-origin policy, and now I'm wondering how web servers (rather than web browsers) enforce it. From what I understand, some enforcement seems to happen on the browser's end (e.g., honoring an Access-Control-Allow-Origin header received from a server). But what about the server? For example, let's say a web server is hosting a JavaScript web app that accesses an API, also hosted on that server. I assume that server would enforce the same-origin policy --- so that only the JavaScript that is hosted on that server would be allowed to access the API. This would prevent someone else from writing a JavaScript client for that API and hosting it on another site, right? So how would a web server be able to stop a malicious client that tries to make AJAX requests to its API endpoints while claiming to be running JavaScript that originated from that same web server? How do the most popular servers (Apache, nginx) protect against this kind of attack? Or is my understanding of this somehow off the mark? Or is the same-origin policy only enforced on the client end?
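
    For what it's worth, the same-origin policy is enforced by the browser, not the server: the server's part is only to declare, via CORS response headers, which foreign origins the browser should additionally allow, and a non-browser client can ignore all of that, so the API still needs its own authentication. A minimal sketch using Python's standard library, with a made-up allowed origin:

        from http.server import BaseHTTPRequestHandler, HTTPServer

        ALLOWED_ORIGIN = "https://app.example.com"   # hypothetical trusted origin

        class ApiHandler(BaseHTTPRequestHandler):
            def do_GET(self):
                body = b'{"status": "ok"}'
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                # The browser compares this header against the requesting page's
                # origin and hides the response from scripts if it does not match;
                # curl or any other non-browser client simply ignores it.
                self.send_header("Access-Control-Allow-Origin", ALLOWED_ORIGIN)
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)

        if __name__ == "__main__":
            HTTPServer(("localhost", 8080), ApiHandler).serve_forever()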

    Read the article

  • Problem with gluOrtho2D()

    - by Shashwat
    I was trying to understand the gluOrtho2D function. I have drawn 4 lines originating from the center and reaching the 4 corners of the screen; you can follow the code below. osize is a variable used to set the parameters of gluOrtho2D; it will create a window of size 2*osize. It works fine when osize is 1: the lines reach the corners. But as I increase the value of osize, the length of the lines decreases (the cross becomes smaller and does not cover the whole screen), even though I think it should still reach the corners.

        void display()
        {
            glClear( GL_COLOR_BUFFER_BIT );
            //glViewport(0, 0, 100, 100);
            glMatrixMode (GL_PROJECTION);
            float osize = 1.2;
            //glOrtho(-osize*1.0, osize*1.0, osize*1.0, -osize*1.0, -1.0, 1.0);
            gluOrtho2D(-osize*1.0, osize*1.0, osize*1.0, -osize*1.0);
            glMatrixMode (GL_MODELVIEW);
            glBegin(GL_LINES);
            glColor3f(0.0, 0.0, 1.0);
            glVertex2f(0.0, 0.0);  glVertex2f(-osize*1.0, -osize*1.0);
            glVertex2f(0.0, 0.0);  glVertex2f(-osize*1.0, osize*1.0);
            glVertex2f(0.0, 0.0);  glVertex2f(osize*1.0, -osize*1.0);
            glVertex2f(0.0, 0.0);  glVertex2f(osize*1.0, osize*1.0);
            glEnd();
            glutSwapBuffers(); //includes glFlush();
        }

    What is the problem?

    Read the article

  • 'ACT On' OVCA for Cloud Providers Program Launch Webcast: June 12, 2014 - 9am UKT / 10am CET / 11am EET

    - by Cinzia Mascanzoni
    We invite you to join the OVCA for Cloud Providers ‘ACT On' program launch at 11am BST / 12noon CET on June 12. · More and more customers realize the value of shifting to a Converged IT Infrastructure; this is why IDC expects this market to grow 40% annually for the next 2 years. · The Oracle Virtual Compute Appliance (OVCA) with attached ZFS storage is the perfect answer to this market trend. By providing rapid application and cloud deployment, OVCA allows customers to cut capital expenditures by up to 50% and deploy key applications up to 7x faster. · For Partners, OVCA supports their journey to consolidation, virtualization and cloud, and allows them to sell higher-value services to their customers. The objective of this webcast is to share with you the OVCA value proposition, help you identify the best target partners, and provide you with the Enablement and Demand Generation content and resources. To register and for further details, click here.

    Read the article
