Search Results

Search found 19551 results on 783 pages for 'enterprise library 5'.


  • PARTNER WEBCASTS: EMEA ALLIANCES AND CHANNELS HARDWARE WEBINARS, JULY 2012

    - by mseika
    PARTNER WEBCASTS: EMEA ALLIANCES AND CHANNELS HARDWARE WEBINARS, JULY 2012. Dear partner, Oracle is pleased to invite you to the following webcasts, dedicated to our EMEA partner community and designed to provide you with important news on our SPARC and Storage product portfolios. Please ensure you don't miss these unique learning opportunities! 1. How to Make Money Selling SPARC! 3pm CET (2pm UK), Tuesday, July 10, 2012. The webcast will be hosted by Rob Ludeman, from SPARC Product Management, and Thomas Ressler, WWA&C Alliances Consultant. Agenda: To bring our partners timely, valuable information, focused on increasing their success in selling SPARC systems. The webcast will be focused and targeted on specific topics and will last approximately 30 minutes. You can submit your questions via WebEx chat, and there will be a live Q&A session at the end of the webcast. REGISTER NOW 2. Introduction to Oracle's New StorageTek SL150 Modular Tape Library. 3pm CET (2pm UK), Thursday, July 12, 2012. This webcast will help you understand Oracle's new StorageTek SL150 Modular Tape Library, the first scalable tape library designed for small and midsized companies that are experiencing high growth. Built from Oracle software and StorageTek library technology, it delivers a cost-effective combination of ease of use and scalability, resulting in overall TCO savings. During the webcast, Cindy McCurley, from Tape Product Management, will introduce you to the latest addition to the Oracle Tape Storage product portfolio, the SL150 Modular Tape Library. This 60-minute webcast will cover the product's features, positioning, unique selling points and a competitive overview of StorageTek. You can submit your questions via WebEx chat, and there will be a live Q&A session at the end of the webcast. REGISTER NOW Delivery Format: This FREE online LIVE eSeminar will be delivered over the Web and Conference Call. Note: Please join the call 10 minutes before the scheduled start time. We look forward to your participation. Best regards, Giuseppe Facchetti, EMEA Partner Business Development Manager, Oracle Hardware Sales, and Sasan Moaveni, EMEA Storage Sales Manager, Oracle Hardware Sales

    Read the article

  • Need a good mp3/internet radio organization system...

    - by Zombies
    So I feel that Winamp alone doesn't work well for me. I have this one playlist that all kinds of stuff gets crammed into. I need a system (this can easily be one program or several) that lets me: play MP3s from my library (just a directory structure with MP3s); save radio stations, and easily add/remove stations; and play random MP3s which are not part of my library and will probably be deleted in the future, yet won't clutter my library index!

    Read the article

  • ArchBeat Link-o-Rama Top 10 for October 28 - November 3, 2012

    - by Bob Rhubart
    The Top 10 most popular items shared on the OTN ArchBeat Facebook Page for the week of Oct 28 - Nov 3, 2012. Eventually, 90% of tech budgets will be outside IT departments | ZDNet Another interesting post from ZDNet blogger Joe McKendrick about changing roles in IT. ADF Mobile - Login Functionality | Andrejus Baranovskis "The new ADF Mobile approach with native deployment is cool when you want to access phone functionality (camera, email, sms and etc.), also when you want to build mobile applications with advanced UI," reports Oracle ACE Director Andrejus Baranovskis. Mobile Development Platform Strategy Chart: ADF Mobile, WebCenter Sites, Portal, Content and Social "Unlike desktop web focused efforts, the world of mobile has undergone change at a feverish pace," says social enterprise expert John Brunswick. His extensive post charts various resources that will help you keep up. ADF Essentials - The Bare Necessities | Floyd Teter The experiment is over… And now Oracle ACE Director Floyd Teter shares his impressions after spending some time with Oracle ADF Essentials, the free version of Oracle ADF. A review of Oracle SOA Suite 11g Administrator’s Handbook | RedStack "More so than any other single piece of content that I have seen on the topic, it provides the information that a SOA administrator needs to know in order to successfully configure, manage, monitor, troubleshoot and backup an Oracle SOA environment." So says Oracle Fusion Middleware A-Team solution architect Mark Nelson of Oracle SOA Suite 11g Administrator’s Handbook, by Ahmed Aboulnaga and Arun Pareek. Expanding the Oracle Enterprise Repository with functional documentation Capgemini middleware specialist Marc Kuijpers shares information on how Oracle Enterprise Repository can be configured "to contain functional assets, i.e. functional designs, use cases and a logical data model" to aid in SOA governance efforts. Podcast: Are You Future Proof? - Part 2 In Part 2, practicing architects and Oracle ACE Directors Ron Batra (AT&T), Basheer Khan (Innowave Technology), and Ronald van Luttikhuizen discuss re-tooling one’s skill set to reflect changes in enterprise IT, including the knowledge to steer stakeholders around the hype to what’s truly valuable. Easy way to access JPA with REST (JSON / XML) | Edwin Biemond Oracle ACE Edwin Biemond shows you "what is possible with JPA-RS, how easy it is and howto setup your own EclipseLink REST service." Clustering ODI11g for High-Availability Part 1: Introduction and Architecture | Richard Yeardley "JEE agents can be deployed alongside, or instead of, standalone agents," says Rittman Meade's Richard Yeardley. "But there is one key advantage in using JEE agents and WebLogic: when you deploy JEE agents as part of a WebLogic cluster they can be configured together to form a high availability cluster." Learn more in Yeardley's extensive post. 2012 IOUG Virtualization SIG – Online Symposium on Nov 7 and Nov 8 | Kai Yu Oracle ACE Director Kai Yu shares information on this week's IOUG Virtualization SIG online event. Does that make it a virtual virtualization event? Thought for the Day "If McDonalds were run like a software company, one out of every hundred Big Macs would give you food poisoning — and the response would be, 'We’re sorry, here’s a coupon for two more.'" — Mark Minasi Source: SoftwareQuotes.com

    Read the article

  • Is Cloud Security Holding Back Social SaaS?

    - by Mike Stiles
    The true promise of social data co-mingling with enterprise data to influence and inform social marketing (all marketing really) lives in cloud computing. The cloud brings processing power, services, speed and cost savings the likes of which few organizations could ever put into action on their own. So why wouldn’t anyone jump into SaaS (Software as a Service) with both feet? Cloud security. Being concerned about security is proper and healthy. That just means you’re a responsible operator. Whether it’s protecting your customers’ data or trying to stay off the radar of regulatory agencies, you have plenty of reasons to make sure you’re as protected from hacking, theft and loss as you can possibly be. But you also have plenty of reasons to not let security concerns freeze you in your tracks, preventing you from innovating, moving the socially-enabled enterprise forward, and keeping up with competitors who may not be as skittish regarding SaaS technology adoption. Over half of organizations are transferring sensitive or confidential data to the cloud, an increase of 10% over last year. With the roles and responsibilities of CMO’s, CIO’s and other C’s changing, the first thing you should probably determine is who should take point on analyzing cloud software options, providers, and policies. An oft-quoted Ponemon Institute study found 36% of businesses don’t have a cloud security policy at all. So that’s as good a place to start as any. What applications and data are you comfortable housing in the cloud? Do you have a classification system for data that clearly spells out where data types can go and how they can be used? Who, both internally and at the cloud provider, will function as admins? What are the different levels of admin clearance? Will your security policies and procedures sync up with those of your cloud provider? The key is verifiable trust. Trust in cloud security is actually going up. 1/3 of organizations polled say it’s the cloud provider who should be responsible for data protection. And when you look specifically at SaaS providers, that expectation goes up to 60%. 57% “strongly agree” or “agree” there’s more confidence in cloud providers’ ability to protect data. In fact, some businesses bypass the “verifiable” part of verifiable trust. Just over half have no idea what their cloud provider does to protect data. And yet, according to the “Private Cloud Vision vs. Reality” InformationWeek Report, 82% of organizations say security/data privacy are one of the main reasons they’re still holding the public cloud at arm’s length. That’s going to be a tough position to maintain, because just as social is rapidly changing the face of marketing, big data is rapidly changing the face of enterprise IT. Netflix, who’s particularly big on the benefits of the cloud, says, "We're systematically disassembling the corporate IT components." An enterprise can never realize the full power of big data, nor get the full potential value out of it, if it’s unwilling to enable the integrations and dataset connections necessary in the cloud. Because integration is called for to reduce fragmentation, a standardized platform makes a lot of sense. With multiple components crafted to work together, you’re maximizing scalability, optimization, cost effectiveness, and yes security and identity management benefits. You can see how the incentive is there for cloud companies to develop and add ever-improving security features, making cloud computing an eventual far safer bet than traditional IT. 
@mikestiles Photo: stock.xchng

    Read the article

  • Live Event: OTN Architect Day: Cloud Computing - Two weeks and counting

    - by Bob Rhubart
    In just two weeks architects and others will gather at the Oracle Conference Center in Redwood Shores, CA for the first Oracle Technology Network Architect Day event of 2013. This event focuses on Cloud Computing, and features sessions specifically focused on real-world examples of the implementation of cloud computing. When: Tuesday, July 9, 2013, 8:30am - 12:30pm Where: Oracle Conference Center, 350 Oracle Pkwy, Redwood City, CA 94065 Register now. It's free! Here's the agenda: 8:30am - 9:00am Registration and Continental Breakfast 9:00am - 9:45am Keynote: 21st Century IT | Dr. James Baty, VP, Global Enterprise Architecture Program, Oracle Imagine a time long, long ago. A time when servers were certified and dedicated to specific applications, when anything posted on an enterprise web site was from restricted, approved channels, and when we tried to limit the growth of 'dirty' data and storage. Today, applications are services running in the multi-tenant hybrid cloud. Companies beg their customers to tweet them, friend them, and publicly rate their products. And constantly analyzing a deluge of Internet, social and sensor data is the key to creating the next super-successful product, or capturing an evil terrorist. The old IT architecture was planned, dedicated, stable, controlled, with separate and well-defined roles. The new architecture is shared, dynamic, continuous, XaaS, DevOps. This keynote session describes the challenges and opportunities that the new business / IT paradigms present to the IT architecture and architects. 9:45am - 10:30am Technical Session: Oracle Cloud: A Case Study in Building a Cloud | Anbu Krishnaswami, Enterprise Architect, Oracle Building a Cloud can be challenging thanks to the complex requirements unique to Cloud computing and the massive scale typically associated with Cloud. Cloud providers can take an Infrastructure as a Service (IaaS) approach and build a cloud on virtualized commodity hardware, or they can take the Platform as a Service (PaaS) path, a service-oriented approach based on pre-configured, integrated, engineered systems. This presentation uses the Oracle Cloud itself as a case study in the use of engineered systems, demonstrating how the technical design of engineered systems is leveraged for building PaaS and SaaS Cloud services and a Cloud management infrastructure. The presentation will also explore the principles, patterns, best practices, and architecture views provided in Oracle's Cloud reference architecture. 10:30am - 10:45am Break 10:45am - 11:30am Technical Session: Database as a Service | Michael Timpanaro-Perrotta, Director, Product Management, Oracle Database Cloud New applications are now commonly built in a Cloud model, where the database is consumed as a service, and many established business processes are beginning to migrate to database as a service (DBaaS). This adoption of DBaaS is made possible by the availability of new capabilities in the database that enable resource pooling, dynamic resource management, model-based provisioning, metered use, and effective quality-of-service controls. This session will examine the catalog of database services at a large commercial bank to understand how these capabilities are enabling DBaaS for a wide range of needs within the enterprise. 11:30am - 12:00pm Panel Q&A: Dr. James Baty, Anbu Krishnaswami, and Michael Timpanaro-Perrotta respond to audience questions. Registration is free, but seating is limited, so register now.

    Read the article

  • FreeType2 Crash on FT_Init_FreeType

    - by JoeyDewd
    I'm currently trying to learn how to use the FreeType2 library for drawing fonts with OpenGL. However, when I start the program it immediately crashes with the following error: "(Can't correctly start the application (0xc000007b))". Commenting out the FT_Init_FreeType call removes the error and my game starts just fine. I'm wondering if it's my code or if it has something to do with loading the DLL file. My code:

        #include "SpaceGame.h"
        #include <ft2build.h>
        #include FT_FREETYPE_H

        // FreeType test
        FT_Library library;

        Game::Game(int Width, int Height)
        {
            // FreeType
            FT_Error error = FT_Init_FreeType(&library);
            if (error)
            {
                cout << "Error occurred during FT initialisation" << endl;
            }
        }

    And my current use of the FreeType2 files: inside my bin folder (where the debug .exe is located) are freetype6.dll, libfreetype.dll.a and libfreetype-6.dll. In Code::Blocks, I've linked to the lib and include folders of the FreeType 2.3.5.1 version and included a compiler flag: -lfreetype. My program starts perfectly fine if I comment out the FT_Init call, which means the includes and library files should be fine. I can't find a solution to my problem and Google isn't helping me, so any help would be greatly appreciated.

    Read the article

  • How do you keep from running into the same problems over and over?

    - by Stephen Furlani
    I keep running into the same problems. The problem itself is irrelevant, but the fact that I keep running into it is completely frustrating. The problem only happens once every 3-6 months or so, as I stub out a new iteration of the project. I keep a journal every time, but I spend at least a day or two each iteration trying to get the issue resolved. How do you guys keep from making the same mistakes over and over? I've tried a journal, but it apparently doesn't work for me. [Edit] A few more details about the issue: Each time I make a new project to hold the files, I import a particular library. The library is a C++ library which includes glew.h and glx.h. GLX redefines BOOL, and that's not kosher since BOOL is a keyword for ObjC. I had a fix the last time I went through this: I #ifndef'd the header in the library to exclude GLEW and GLX, and everything worked hunky-dory. This time, however, I do the same thing and use the same #ifndef block, but now it throws a bunch of errors. I go back to the old project, and it works. New project, no-worky. It seems like it does this every time, and my solution to it is new each time for some reason. I know #defines and #includes are one of the trickiest areas of C++ (and cross-language with Objective-C), but I had this working and now it's not.
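
    For illustration, here is a minimal sketch of the kind of guard being described, assuming the library exposes its GL includes through a single wrapper header (the header and guard names are hypothetical). The __OBJC__ macro is predefined by the compiler for Objective-C and Objective-C++ translation units, so it is one hedged way to keep GLEW/GLX out of the code paths where BOOL is already taken:

        // mylib_gl.h -- hypothetical wrapper header inside the C++ library
        #ifndef MYLIB_GL_H
        #define MYLIB_GL_H

        #ifndef __OBJC__
            // Pull in GLEW/GLX only for pure C/C++ translation units, so their
            // Bool/BOOL definitions never collide with Objective-C's BOOL.
            #include <GL/glew.h>
            #include <GL/glx.h>
        #endif

        #endif // MYLIB_GL_H

    Whether this matches the asker's original fix is unclear; keeping a guard like this at the single point where GLX is included is simply one way to stop the exclusion logic from drifting between projects.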

    Read the article

  • Oracle Database Upcoming Event dates to know

    - by mandy.ho
    February may be a short month, but it's not short of exciting Oracle events. From information-packed "Real Performance Days" to participation in one of the biggest IT Security events - look out for Oracle Database and let us know if you are there with us! Feb 13-18, 2011 - Las Vegas, NV TDWI World Conference Series Join Oracle in highlighting Exadata x2-2 and x2-8, along with Oracle Business Intelligence, Enterprise Performance Management and Data Warehousing solutions. Oracle will be presenting a workshop - Oracle Data Integration: Best-of-Breed Solutions for the Enterprise - Wednesday, February 16, 2011, 7 p.m. - 9 p.m., with Glen Goodrich, Director of Product Management, and Christophe Dupupet, Director of Product Management, Data Integration. http://events.tdwi.org/events/las-vegas-world-conference-2011/sessions/session-list.aspx Feb 14-17, 2011 - Barcelona, Spain Mobile World Congress MWC is an event where Oracle showcases the near-complete breadth and depth of value that our Communications Industry strategy and Hardware and Software Solutions can deliver. Oracle supports Communications Service Providers today and delivers platforms and flexibility primed for the future. Oracle will have a two-story Pavilion, along with an Oracle Java and Embedded Solutions Center - App Planet. The exhibition times are: Monday, 14th February, 09.00 - 19.00; Tuesday, 15th February, 09.00 - 19.00; Wednesday, 16th February, 09.00 - 19.00; Thursday, 17th February, 09.00 - 16.00. Have questions? Meet with Oracle Sales representatives at the Oracle Café. Open every day from 9:00 to 17:00. http://eventreg.oracle.com/webapps/events/ns/EventsDetail.jsp?p_eventId=109912&src=6973382&src=6973382&Act=4 Feb 14-18, 2011 - San Francisco, CA RSA Conference As the world's most complete, open, integrated business software and hardware systems provider, Oracle can uniquely safeguard your information throughout its entire lifecycle. Learn more by attending these sessions: Cloud Computing: A Brave New World for Security and Privacy (CLD-201) Wednesday, February 16 at 8:30 a.m. Databases Under Attack - Securing Heterogeneous Database Infrastructures (DAS-301) Thursday, February 17, 2011 at 8:30 a.m. Seven Steps to Protecting Databases (DAS-402) Friday, February 18 at 10:10 a.m. RSA Conference attendees will also have the opportunity to meet with Oracle Security Solution experts, see live product demos and more by visiting booth #1559. Hours: Monday, February 14, 6:00 p.m. - 8:00 p.m., Tuesday, February 15, 11:00 a.m. - 6:00 p.m. and 4:30 p.m. - 6:00 p.m., Wednesday, February 16, 11:00 a.m. - 6:00 p.m., and Thursday, February 17, 11:00 a.m. - 3:00 p.m. http://eventreg.oracle.com/webapps/events/ns/EventsDetail.jsp?p_eventId=127657&src=6967733&src=6967733&Act=12 Feb 21-25, 2011 - Various Locations IOUG Presents - A Day of Real World Performance with Tom Kyte, Andrew Holdsworth and Graham Wood. These Oracle experts will debate, discuss and delineate the best practices for designing hardware architectures, deploying Oracle databases, and developing applications that deliver the fastest possible performance for your business. Topics are covered in a conversational format - with all three chiming in where appropriate. Each presenter has their own screen projector to demonstrate their individual points to the participants. Customers will have the opportunity to get their specific performance/tuning questions answered and learn how to balance all the different environmental requirements for their applications to improve performance.
Register today for the following dates and locations • February 21 in San Diego, CA • February 22 in Los Angeles, CA • February 23 in Seattle, WA • February 25 in Phoenix, AZ http://www.ioug.org/tabid/194/Default.aspx Feb 8-24 - Various Oracle Enterprise Cloud Summit This series of full-day events with cloud experts, sharing real-world best practices, reference architectures and more continues during the month of February. Attend the Oracle Enterprise Cloud Summit to learn how to: • Build a state-of-the-art cloud architecture • Leverage your existing IT investments • Optimize your IT management processes Whether you are considering a move to cloud computing or have already adopted a cloud model, this event offers you the insights you need to take full advantage of cloud computing. Check below to see if the event is coming to a city near you. http://www.oracle.com/us/corporate/events/cloud-events-214342.html

    Read the article

  • CodePlex Daily Summary for Wednesday, November 07, 2012

    CodePlex Daily Summary for Wednesday, November 07, 2012Popular ReleasesMetodología General Ajustada - MGA: 03.04.03: Cambios Parmenio: Ajustes al formato F02 de programación para que la sincronización de las grillas no afecte el guardado de los datos. Cambios John: Integración de código con cambios enviados por Parmenio. Generación de instaladores. Soporte técnico por correo electrónico, telefónico y en sitio.nAPI for Windows Phone: Naver Open API Library Assemblies and Source Codes: nAPI (Naver Open API Library) for Windows Family Tested on - Windows 8 (Windows Store App) - Windows Phone 7 - Windows Phone 8 (Emulator)X-tee.NET: Xtee.NET 1.0: Generaator ja teegidFiskalizacija za developere: FiskalizacijaDev 1.2: Verzija 1.2. je, prije svega, odgovor na novu verziju Tehnicke specifikacije (v1.1.) koja je objavljena prije nekoliko dana. Pored novosti vezanih uz (sitne) izmjene u spomenutoj novoj verziji Tehnicke dokumentacije, projekt smo prošili sa nekim dodatnim feature-ima od kojih je vecina proizašla iz vaših prijedloga - hvala :) Novosti u v1.2. su: - Neusuglašenost zahtjeva (http://fiskalizacija.codeplex.com/workitem/645) - Sample projekt - iznosi se množe sa 100 (http://fiskalizacija.codeplex.c...PowerComboBox: PowerComboBox VB v1.0: Visual Basic source code class file.Edi: Themable Edi: Completed ExpressionDark theme Improved Error Handling and Reporting feature Refactored all views to be look-less controlsMFCMAPI: October 2012 Release: Build: 15.0.0.1036 Full release notes at SGriffin's blog. If you just want to run the MFCMAPI or MrMAPI, get the executables. If you want to debug them, get the symbol files and the source. The 64 bit builds will only work on a machine with Outlook 2010 64 bit installed. All other machines should use the 32 bit builds, regardless of the operating system. Facebook BadgeJayData - The cross-platform HTML5 data-management library for JavaScript: JayData 1.2.3: JayData is a unified data access library for JavaScript to CRUD + Query data from different sources like OData, MongoDB, WebSQL, SqLite, HTML5 localStorage, Facebook or YQL. The library can be integrated with Knockout.js or Sencha Touch 2 and can be used on Node.js as well. See it in action in this 6 minutes video Sencha Touch 2 example app using JayData: Netflix browser. What's new in JayData 1.2.3 For detailed release notes check the release notes. TypeScript supportWrite your code in a ...SSIS Expression Editor & Tester: Expression Editor and Tester v1.0.8.0: Getting Started Download and extract the files, no install required. The ExpressionEditor.zip download contains a folder for each SQL Server version. ExpressionEditor2005 ExpressionEditor2008 ExpressionEditor2012 Changes Fixed issues 32868 and 33291 raised by BIDS Helper users. No functional changes from previous release. Versions There are three versions included, all built from the same code with the same functionality, but each targeting a different release of SQL Server. The downlo...MCEBuddy 2.x: MCEBuddy 2.3.7: Changelog for 2.3.7 (32bit and 64bit) 1. Improved performance of MP4 Fast and M4V Fast Profiles (no deinterlacing, removed --decomb) 2. Improved priority handling 3. Added support for Pausing and Resume conversions 4. Added support for fallback to source directory if network destination directory is unavailable 5. MCEBuddy now installs ShowAnalyzer during installation 6. 
Added support for long description atom in iTunesFoxyXLS: FoxyXLS Releases: Source code and samplesDyanamic Reports (RDLC) - SharePoint 2010 Visual WebPart: Initial Release: This is a Initial Release.HTML Renderer: HTML Renderer 1.0.0.0 (3): Major performance improvement (http://theartofdev.wordpress.com/2012/10/25/how-i-optimized-html-renderer-and-fell-in-love-with-vs-profiler/) Minor fixes raised in issue tracker and discussions.ProDinner - ASP.NET MVC Sample (EF4.4, N-Tier, jQuery): 8: update to ASP.net MVC Awesome 3.0 udpate to EntityFramework 4.4 update to MVC 4 added dinners grid on homepageASP.net MVC Awesome - jQuery Ajax Helpers: 3.0: added Grid helper added XML Documentation added textbox helper added Client Side API for AjaxList removed .SearchButton from AjaxList AjaxForm and Confirm helpers have been merged into the Form helper optimized html output for AjaxDropdown, AjaxList, Autocomplete works on MVC 3 and 4BlogEngine.NET: BlogEngine.NET 2.7: Cheap ASP.NET Hosting - $4.95/Month - Click Here!! Click Here for More Info Cheap ASP.NET Hosting - $4.95/Month - Click Here! If you want to set up and start using BlogEngine.NET right away, you should download the Web project. If you want to extend or modify BlogEngine.NET, you should download the source code. If you are upgrading from a previous version of BlogEngine.NET, please take a look at the Upgrading to BlogEngine.NET 2.7 instructions. If you looking for Web Application Project, ...Launchbar: Launchbar 4.2.2.0: This release is the first step in cleaning up the code and using all the latest features of .NET 4.5 Changes 4.2.2 (2012-11-02) Improved handling of left clicks 4.1.0 (2012-10-17) Removed tray icon Assembly renamed and signed with strong name Note When you upgrade, Launchbar will start with the default settings. You can import your previous settings by following these steps: Run Launchbar and just save the settings without configuring anything Shutdown Launchbar Go to the folder %LOCA...Mouse Jiggler: MouseJiggle-1.3: This adds the much-requested minimize-to-tray feature to Mouse Jiggler.Umbraco CMS: Umbraco 4.10.0 Release Candidate: This is a Release Candidate, which means that if we do not find any major issues in the next week, we will release this version as the final release of 4.10.0 on November 9th, 2012. The documentation for the MVC bits still lives in the Github version of the docs for now and will be updated on our.umbraco.org with the final release of 4.10.0. Browse the documentation here: https://github.com/umbraco/Umbraco4Docs/tree/4.8.0/Documentation/Reference/Mvc If you want to do only MVC then make sur...Skype Auto Recorder: SkypeAutoRecorder 1.3.4: New icon and images. Reworked settings window. Implemented high-quality sound encoding. Implemented a possibility to produce stereo records. Added buttons with system-wide hot keys for manual starting and canceling of recording. Added buttons for opening folder with records. Added Help button. Fixed an issue when recording is continuing after call end. Fixed an issue when recording doesn't start. Fixed several bugs and improved stability. Major refactoring and optimization...New Projects"On the Fly Zip and Attach" Windows Live Writer Plugin: This is a windows live writer zipping plug-in that allows you to select files/folders and zip them on the fly that will appear as attachment inserts, while you are writing blogs. 
Find details @ my Blogs Site: [url: Blogs: http://www.geekscafe.net|http://www.geekscafe.net].NET Micro Framework Tools: Collection of tools usefull for creating Netduino and .NET Micro Framework applications.Aktina: Aktina is a game engine written in DirectX 11.AppHub: AppHub??????AppStore???,???????,?????????,?????????,????????????,??,???????,???????????、??、??、??????????????,???????。 ??????????,????。Calcula Calles: Calcula CallesCRM 2011 ASP.NET Membership Provider: The CRM 2011 ASP.NET Membership provider Contains a Membership and Role provider, which can be used in ASP.NET Applications and/or ASP.NET based CMS systems.CSharp Executor: Dynamically execute C# script files in the same way that you might use VB scripts.Deque (by Stephen Cleary): A simple double-ended queue (deque) in C#. Unit tested.DirST: Allows one to replicate a DIRectory STructure without copying files. Written in C#.DNNTaskManager69: Testing DNN Module DevelopmentEchelon OS: an Sister Project to Aza DOS and this one is meant to be as minimilistic as it can with a filesystem and text editorGeek Reader: Lector de noticias para windows 8. Se trata de un template que permite rápidamente construir una aplicación windows 8 store GIII_P2: projekt 2katas-gpa: Codigo de los coding dojo acerca de patrones de diseñoKendoUI: Demonstrations using Kendo UI Complete for ASP.NET MVCLTorrent: C# BitTorrent and peer-to-peer protocol implementation MECopter: A real summary will follow soon...Migrate AD Group Permissions in Sharepoint/WSS: Sharepoint AD Group Migration tool, allows conversion of AD security principals used in Sharepoint ACLs from source domain to destination domainNetSend Manager: NetSend Manager is an Asp.Net console which enables you to send windows popup messages over a domain network. Messages can be send to a single user or to a grouNONAME0: ?? ? ???????-?????? ??? ?????? ???? ?????? ???? ??? ?? ???????????????? ??? ? ?????? ??? ??????????npr2012: Projekt testowy Personal News Assistant: Vision: The Personal News Assistant is an application which collects information from different sources and informs the user either on demand or preventive.PGIrony - Tookit & Examples for AST Generation with Irony: A tool-kit to ease AST generation,, and further ease grammr construction, with Irony.Proboscis: A query provider to work against HBase. Will be developed against HBase on Windows Azure.Project13251106: papaProject13271106: sdgProjet IMA: rtjRenderCraft: This is an editor like a AMD's RenderMonkey. But this supports Direct3D 11.Rx (Reactive Extensions): The Reactive Extensions (Rx) is a library for composing asynchronous and event-based programs using observable sequences and LINQ-style query operators. Sanle: sanle project with c++Screener: Screen sharing software.SemantEx: Adds a validation wrapper around regular expressions, allowing you to automatically apply conditional logic to capture groups.SharePoint CAML Extensions: SharePoint CAML ExtensionsSilentPlace: Turn on silence mode when you are in a silence placeSmartMacros: SmartMacros allows developers to define macros in C# and use them inside other source code. These macros are much stronger and safer than C/C++ macros, therefore they're called "Smart" :-)SoundArea link grabber: Soundarea link grabber the script put a list of links in your clipboard.Time Tracker Kickstart: This project is currently a work-in-progress.TransparentImage: TransparentImage is a console application that converts BMPS into PNG files. 
TransparentImage is written in VB and supports drop n drag.Travis7: A Travis-CI client for Windows PhoneWindows 8 Accelerator: A set of components and controls to accelerate your Windows 8 and Windows Phone 8 application development.Windows and Windows Phone DNS Library: Simple DNS lookup library for Windows and Windows Phone applicationsWinJS Toolkit - JavaScript Toolkit for Windows 8: The WinJS Toolkit is a set of classes, helper functions and tools that help creating Windows Store applications in HTML5, CSS3 and JavaScript.WPF TB: Study WPF????????: ZJU software project homework,One part of the total stocktrade system.

    Read the article

  • The Stub Proto: Not Just For Stub Objects Anymore

    - by user9154181
    One of the great pleasures of programming is to invent something for a narrow purpose, and then to realize that it is a general solution to a broader problem. In hindsight, these things seem perfectly natural and obvious. The stub proto area used to build the core Solaris consolidation has turned out to be one of those things. As discussed in an earlier article, the stub proto area was invented as part of the effort to use stub objects to build the core ON consolidation. Its purpose was merely as a place to hold stub objects. However, we keep finding other uses for it. It turns out that the stub proto should be more properly thought of as an auxiliary place to put things that we would like to put into the proto to help us build the product, but which we do not wish to package or deliver to the end user. Stub objects are one example, but private lint libraries, header files, archives, and relocatable objects, are all examples of things that might profitably go into the stub proto. Without a stub proto, these items were handled in a variety of ad hoc ways: If one part of the workspace needed private header files, libraries, or other such items, it might modify its Makefile to reach up and over to the place in the workspace where those things live and use them from there. There are several problems with this: Each component invents its own approach, meaning that programmers maintaining the system have to invest extra effort to understand what things mean. In the past, this has created makefile ghettos in which only the person who wrote the makefiles feels confident to modify them, while everyone else ignores them. This causes many difficulties and benefits no one. These interdependencies are not obvious to the make, utility, and can lead to races. They are not obvious to the human reader, who may therefore not realize that they exist, and break them. Our policy in ON is not to deliver files into the proto unless those files are intended to be packaged and delivered to the end user. However, sometimes non-shipping files were copied into the proto anyway, causing a different set of problems: It requires a long list of exceptions to silence our normal unused proto item error checking. In the past, we have accidentally shipped files that we did not intend to deliver to the end user. Mixing cruft with valuable items makes it hard to discern which is which. The stub proto area offers a convenient and robust solution. Files needed to build the workspace that are not delivered to the end user can instead be installed into the stub proto. No special exceptions or custom make rules are needed, and the intent is always clear. We are already accessing some private lint libraries and compilation symlinks in this manner. Ultimately, I'd like to see all of the files in the proto that have a packaging exception delivered to the stub proto instead, and for the elimination of all existing special case makefile rules. This would include shared objects, header files, and lint libraries. I don't expect this to happen overnight — it will be a long term case by case project, but the overall trend is clear. The Stub Proto, -z assert_deflib, And The End Of Accidental System Object Linking We recently used the stub proto to solve an annoying build issue that goes back to the earliest days of Solaris: How to ensure that we're linking to the OS bits we're building instead of to those from the running system. 
The Solaris product is made up of objects and files from a number of different consolidations, each of which is built separately from the others from an independent code base called a gate. The core Solaris OS consolidation is ON, which stands for "Operating System and Networking". You will frequently also see ON called the OSnet. There are consolidations for X11 graphics, the desktop environment, open source utilities, compilers and development tools, and many others. The collection of consolidations that make up Solaris is known as the "Wad Of Stuff", usually referred to simply as the WOS. None of these consolidations is self contained. Even the core ON consolidation has some dependencies on libraries that come from other consolidations. The build server used to build the OSnet must be running a relatively recent version of Solaris, which means that its objects will be very similar to the new ones being built. However, it is necessarily true that the build system objects will always be a little behind, and that incompatible differences may exist. The objects built by the OSnet link to other objects. Some of these dependencies come from the OSnet, while others come from other consolidations. The objects from other consolidations are provided by the standard library directories on the build system (/lib, /usr/lib). The objects from the OSnet itself are supposed to come from the proto areas in the workspace, and not from the build server. In order to achieve this, we make use of the -L command line option to the link-editor. The link-editor finds dependencies by looking in the directories specified by the caller using the -L command line option. If the desired dependency is not found in one of these locations, ld will then fall back to looking at the default locations (/lib, /usr/lib). In order to use OSnet objects from the workspace instead of the system, while still accessing non-OSnet objects from the system, our Makefiles set -L link-editor options that point at the workspace proto areas. In general, this works well and dependencies are found in the right places. However, there have always been failures: Building objects in the wrong order might mean that an OSnet dependency hasn't been built before an object that needs it. If so, the dependency will not be seen in the proto, and the link-editor will silently fall back to the one on the build server. Errors in the makefiles can wipe out the -L options that our top level makefiles establish to cause ld to look at the workspace proto first. In this case, all objects will be found on the build server. These failures were rarely if ever caught. As I mentioned earlier, the objects on the build server are generally quite close to the objects built in the workspace. If they offer compatible linking interfaces, then the objects that link to them will behave properly, and no issue will ever be seen. However, if they do not offer compatible linking interfaces, the failure modes can be puzzling and hard to pin down. Either way, there won't be a compile-time warning or error. The advent of the stub proto eliminated the first type of failure. With stub objects, there is no dependency ordering, and the necessary stub object dependency will always be in place for any OSnet object that needs it. However, makefile errors do still occur, and so, the second form of error was still possible. 
While working on the stub object project, we realized that the stub proto was also the key to solving the second form of failure caused by makefile errors: Due to the way we set the -L options to point at our workspace proto areas, any valid object from the OSnet should be found via a path specified by -L, and not from the default locations (/lib, /usr/lib). Any OSnet object found via the default locations means that we've linked to the build server, which is an error we'd like to catch. Non-OSnet objects don't exist in the proto areas, and so are found via the default paths. However, if we were to create a symlink in the stub proto pointing at each non-OSnet dependency that we require, then the non-OSnet objects would also be found via the paths specified by -L, and not from the link-editor defaults. Given the above, we should not find any dependency objects from the link-editor defaults. Any dependency found via the link-editor defaults means that we have a Makefile error, and that we are linking to the build server inappropriately. All we need to make use of this fact is a linker option to produce a warning when it happens. Although warnings are nice, we in the OSnet have a zero tolerance policy for build noise. The -z fatal-warnings option that was recently introduced with -z guidance can be used to turn the warnings into fatal build errors, forcing the programmer to fix them. This was too easy to resist. I integrated 7021198 ld option to warn when link accesses a library via default path PSARC/2011/068 ld -z assert-deflib option into snv_161 (February 2011), shortly after the stub proto was introduced into ON. This putback introduced the -z assert-deflib option to the link-editor: -z assert-deflib=[libname] Enables warning messages for libraries specified with the -l command line option that are found by examining the default search paths provided by the link-editor. If a libname value is provided, the default library warning feature is enabled, and the specified library is added to a list of libraries for which no warnings will be issued. Multiple -z assert-deflib options can be specified in order to specify multiple libraries for which warnings should not be issued. The libname value should be the name of the library file, as found by the link-editor, without any path components. For example, the following enables default library warnings, and excludes the standard C library. ld ... -z assert-deflib=libc.so ... -z assert-deflib is a specialized option, primarily of interest in build environments where multiple objects with the same name exist and tight control over the library used is required. It is not intended for general use. Note that the definition of -z assert-deflib allows for exceptions to be specified as arguments to the option. In general, the idea of using a symlink from the stub proto is superior because it does not clutter up the link command with a long list of objects. When building the OSnet, we usually use the plain form of -z assert-deflib, and make symlinks for the non-OSnet dependencies. The exceptions to this are dependencies supplied by the compiler itself, which are usually found at whatever arbitrary location the compiler happens to be installed at. To handle these special cases, the command line version works better. Following the integration of the link-editor change, I made use of -z assert-deflib in OSnet builds with 7021896 Prevent OSnet from accidentally linking to build system which integrated into snv_162 (March 2011).
Turning on -z assert-deflib exposed between 10 and 20 existing errors in our Makefiles, which were all fixed in the same putback. The errors we found in our Makefiles underscore how difficult they can be to prevent without an automatic system in place to catch them. Conclusions The stub proto is proving to be a generally useful construct for ON builds that goes beyond serving as a place to hold stub objects. Although invented to hold stub objects, it has already allowed us to simplify a number of previously difficult situations in our makefiles and builds. I expect that we'll find uses for it beyond those described here as we go forward.
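
To make the mechanics above concrete, here is a rough sketch of how the symlinks and the linker option fit together; the paths, library names and variables are illustrative only, not the actual OSnet makefile rules:

    # Hypothetical example -- $PROTO/$STUBPROTO stand in for the workspace proto areas.

    # Non-OSnet dependencies get symlinks in the stub proto, so they are found
    # via a -L path rather than the link-editor's default locations.
    ln -s /usr/lib/libxml2.so $STUBPROTO/usr/lib/libxml2.so

    # Link against the workspace protos first. With -z assert-deflib, any library
    # that still resolves through the defaults (/lib, /usr/lib) is reported, and
    # -z fatal-warnings turns that report into a hard build error.
    ld -G -o libfoo.so foo.o \
        -L$STUBPROTO/usr/lib -L$PROTO/usr/lib \
        -z assert-deflib -z fatal-warnings \
        -lxml2

This is only a sketch of the idea described in the article, not the real build rules; the point is simply that once every legitimate dependency is reachable through a -L path, anything found on the default paths is, by construction, a makefile error.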

    Read the article

  • Insurers Pushed to Transform Their Business

    - by Calvin Glenn
    Everyone in the P&C industry has heard it “We can’t do it.” “Nobody wants to do it.” “We can’t afford to do it.”  Unfortunately, what they’re referencing are the reasons many insurers are still trying to maintain their business processing on legacy policy administration systems, attempting to bide time until there is no other recourse but to give in, bite the bullet, and take on the monumental task of replacing an entire policy administration system (PAS). Just the thought of that project sends IT, Business Users and Management reeling. However, is that fear real?  It is a bit daunting when one realizes that a complete policy administration system replacement will touch most every function an insurer manages, from quoting and rating, to underwriting, distribution, and even customer service. With that, everyone has heard at least one horror story around a transformation initiative that has far exceeded budget and the promised implementation / go-live timeline.    But, does it have to be that hard?  Surely, in the age where a person can voice-activate their DVR to record a TV program from a cell phone, there has to be someone somewhere who’s figured out how to simplify this process. To be able to help insurers, of all sizes, transform and grow their business while also delivering on their overall objectives of providing speed to market, straight-through-processing for applications, quoting, underwriting, and simplified product development. Maybe we’re looking too hard and the answer is simple and straight-forward. Why replace the entire machine when all it really needs is a new part…a single enterprise rating system? This core, modular piece of the policy administration system is the foundation of product development and rate management that enables insurers to provide the right product at the right price to the right customer through the best channels at any given moment in time. The real benefit of a single enterprise rating system is the ability to deliver enhanced business capabilities, such as improved product management, streamlined underwriting, and speed to market. With these benefits, carriers have accomplished a portion of their overall transformation goal. Furthermore, lessons learned from the rating project can be applied to the bigger, down-the-road PAS project to support the successful completion of the overall transformation endeavor. At the recent Oracle OpenWorld Conference in San Francisco, information was shared with attendees about a recent “go-live” project from an Oracle Insurance Tier 1 insurer who did what is proposed above…replaced just the rating portion of their legacy policy administration system with Oracle Insurance Insbridge Rating and Underwriting.  This change provided the insurer greater flexibility to set rates that better reflect risk while enabling the company to support its market segment strategy. Using the Oracle Insurance Insbridge enterprise rating solution, the insurer was able to reduce processing time for agents and underwriters, gained the ability to support proprietary rating models and improved pricing accuracy.      There is mounting pressure on P&C insurers to produce growth and show net profitability in the midst of modest overall industry growth, large weather-related losses and intensifying competition for market share.  Insurers are also being asked to improve customer service, offer a differentiated value proposition and simplify insurance processes.  
While the demands are many, there is an easy answer…invest in and update the most mission-critical application in your arsenal, the single enterprise rating system. Download the Podcast to listen to “Stand-Alone Rating Engine - Leading Force Behind Core Transformation Projects in the P&C Market,” a podcast originally recorded in October 2013. Related Resources: White Paper: Stand-Alone Rating Engine: Leading Force Behind Core Transformation Projects in the P&C Market Webcast On Demand: Stand-Alone Rating Engine and Core Transformation for P&C Insurers Don’t forget to keep up with us year-round: Facebook: www.facebook.com/oracleinsurance Twitter: www.twitter.com/oracleinsurance YouTube: www.youtube.com/oracleinsurance

    Read the article

  • Continuous integration never results in build errors

    - by Jon
    Hi, I'm working with a variety of Java EE websites which use internal libraries we've developed. For each website, we only upgrade to new versions of our internal libraries as needed, and before committing we make sure that the site compiles fine. What this means is that when TeamCity does a build of one of our sites, the site compiles fine, but later, when the site is updated to the latest version of the internal libraries, there might be a compile error. Is there a good way to handle this? We're not using Maven yet; would using Maven mean that our websites could automatically use the latest version of internal libraries? Thanks. Clarification: What we sometimes run into is this: Project A depends on a library, and is currently using library version 1.0. Project B also depends on that library. I make changes to the library so that it is now version 1.5. Project B now uses 1.5. Project A and project B have both been built just fine by the CI server (TeamCity). Working on project A again, I update to 1.5 and discover that 1.5 has breaking changes in it. Is there a way for the CI server to discover these kinds of breaking changes?

    Read the article

  • Security and the Mobile Workforce

    - by tobyehatch
    Now that many organizations are moving to the BYOD philosophy (bring your own devices), security for phones and tablets accessing company sensitive information is of paramount importance. I had the pleasure to interview Brian MacDonald, Principal Product Manager for Oracle Business Intelligence (BI) Mobile Products, about this subject, and he shared some wonderful insight about how the Oracle Mobile Security Tool Kit is addressing mobile security and doing some pretty cool things.  With the rapid proliferation of phones and tablets, there is a perception that mobile devices are a security threat to corporate IT, that mobile operating systems are not secure, and that there are simply too many ways to inadvertently provide access to critical analytic data outside the firewall. Every day, I see employees working on mobile devices at the airport, while waiting for their airplanes, and using public WIFI connections at coffee houses and in restaurants. These methods are not typically secure ways to access confidential company data. I asked Brian to explain why. “The native controls for mobile devices and applications are indeed insufficiently secure for corporate deployments of Business Intelligence and most certainly for businesses where data is extremely critical - such as financial services or defense - although it really applies across the board. The traditional approach for accessing data from outside a firewall is using a VPN connection which is not a viable solution for mobile. The problem is that once you open up a VPN connection on your phone or tablet, you are creating an opening for the whole device, for all the software and installed applications. Often the VPN connection by itself provides insufficient encryption – if any – which means that data can be potentially intercepted.” For this reason, most organizations that deploy Business Intelligence data via mobile devices will only do so with some additional level of control. So, how has the industry responded? What are companies doing to address this very real threat? Brian explained that “Mobile Device Management (MDM) and Mobile Application Management (MAM) software vendors have rapidly created solutions for mobile devices that provide a vast array of services for controlling, managing and establishing enterprise mobile usage policies. On the device front, vendors now support full levels of encryption behind the firewall, encrypted local data storage, credential management such as federated single-sign-on as well as remote wipe, geo-fencing and other risk reducing features (should a device be lost or stolen). More importantly, these software vendors have created methods for providing these capabilities on a per application basis, allowing for complete isolation of the application from the mobile operating system. Finally, there are tools which allow the applications themselves to be distributed through enterprise application stores allowing IT organizations to manage who has access to the apps, when updates to the applications will happen, and revoke access after an employee leaves. So even though an employee may be using a personal device, access to company data can be controlled while on or near the company premises. So do the Oracle BI mobile products integrate with the MDM and MAM vendors? Brian explained that our customers use a wide variety of mobile security vendors and may even have more than one in-house. 
Therefore, Oracle is ensuring that users have a choice and a mechanism for linking together Oracle’s BI offering with their chosen vendor’s secure technology. The Oracle BI Mobile Security Toolkit, which is a version of the Oracle BI Mobile HD application, delivered through the Oracle Technology Network (OTN) in its component parts, helps Oracle users to build their own version of the Mobile HD application, sign it with their own enterprise development certificates, link with their security vendor of choice, then deploy the combined application through whichever means they feel most appropriate, including enterprise application stores.  Brian further explained that Oracle currently supports most of the major mobile security vendors, has close relationships with each, and maintains strong partnerships enabling both Oracle and the vendors to test, update and release a cooperating solution in lock-step. Oracle also ensures that as new versions of the Oracle HD application are made available on the Apple iTunes store, the same version is also immediately made available through the Security Toolkit on OTN.  Rest assured that as our workforce continues down the mobile path, company sensitive information can be secured.  To listen to the entire podcast, click here. To learn more about the Oracle BI Mobile HD, click  here To learn more about the BI Mobile Security Toolkit, click here 

    Read the article

  • Are Chief Digital Officers the Result of CMO/CIO Refusal to Change?

    - by Mike Stiles
    Apparently CDO no longer just stands for “Collateralized Debt Obligations.” It stands for Chief Digital Officer. And they’re the ones who are supposed to answer the bat signal CEOs are throwing into the sky, swoop in and POW! drive the transition of the enterprise to integrated digital systems. So imagine being a CMO or a CIO at such an enterprise and realizing it’s been determined that you are not the answer that’s needed. In fact, IntelligentHQ author Ashley Friedlein points out the very rise of the CDO is an admission of C-Suite failure to become savvy enough, quickly enough in modern technology. Is that fair? Despite the repeated drumbeat that CMOs and CIOs must enter a new era of cooperation and collaboration to enact the social-enabled enterprise, the verdict seems to be that if it’s happening at all, it’s not happening fast enough. Therefore, someone else is needed with the authority to make things happen. So who is this relatively new beast? Gartner VP David Willis says, “The Chief Digital Officer plays in the place where the enterprise meets the customer, where the revenue is generated, and the mission accomplished.” In other words, where the rubber meets the road. They aren’t just another “C” heading up a unit. They’re the CEO’s personal SWAT team, able to call the shots necessary across all units to affect what has become job one…customer experience. And what are the CMOs and CIOs doing while this is going on? Playing corporate games. Accenture reports 38% of CMOs say IT deliberately keeps them out of the loop, with 35% saying marketing’s needs aren’t a very high priority. 31% of CIOs say marketers don’t understand tech and regularly go around them for solutions. Fun! Meanwhile the CEO feels the need to bring in a parental figure to pull it all together. Gartner thinks 25% of all orgs will have a CDO by 2015 as CMOs and particularly CIOs (Peter Hinssen points out many CDOs are coming “from anywhere but IT”) let the opportunity to be the agent of change their company needs slip away. Perhaps most interestingly, these CDOs seem to be entering the picture already on the fast track. One consultancy counted 7 instances of a CDO moving into the CEO role, which, as this Wired article points out, is pretty astounding since nobody ever heard of the job a few years ago. And vendors are quickly figuring out that this is the person they need to be talking to inside the brand. The position isn’t without its critics. Forrester’s Martin Gill says the reaction from executives at some traditional companies to someone being brought in to be in charge of digital might be to wash their own hands of responsibility for all things digital – a risky maneuver given the pervasiveness of digital in business. They might not even be called Chief Digital Officers. They might be the Chief Customer Officer, Chief Experience Officer, etc. You can call them Twinkletoes if you want to, but essentially anyone who has the mandate direct from the CEO to enact modern technology changes not currently being championed by the CMO or CIO can be regarded as “boss.” @mikestiles @oraclesocial Photo: freedigitalphotos.net

    Read the article

  • Chalk Talk with John: What Does User Experience Mean to You?

    - by Tanu Sood
    Author: John Brunswick The "Chalk Talk with John" series will explore the practical value of Middleware in the context of two fictional communities, shared through analogies aligned to enterprise technology. This format offers business stakeholders and IT a common language for understanding the benefits of technology in support of their business initiatives, regardless of their current level of technical knowledge. I will endeavor to showcase an episode highlighting business use cases and how technology plays a role in business on a bi-weekly basis. The debut episode highlights the benefits of user experience capabilities supplied by Portal technologies, by juxtaposing the communities of Middleware Fields and Codeaway Valley with regard to the time and effort their residents spend performing everyday tasks. This comparison provides insight into the benefits of leveraging a common user experience foundation to support the tasks that our employees, customers and partners engage in on a daily basis with our organizations. Take a look and let me know your thoughts! About me: Hi, I am John Brunswick, an Oracle Enterprise Architect. As an Oracle Enterprise Architect, I focus on the alignment of technical capabilities in support of business vision and objectives, as well as the overall business value of technology. Before coming to Oracle, I was a Practice Manager within BEA System's Business Interaction Division consulting organization, orchestrating enterprise systems in support of line of business goals. Connect with me on Twitter and visit my site for Oracle Fusion Middleware related tips.

    Read the article

  • Leveraging .Net 4.0 Framework Tools For Encrypting Web Configuration Sections

    - by Sam Abraham
    I would like to share a few points with regard to encrypting web configuration sections in .Net 4.0. This information is also applicable to .Net 3.5 and 2.0. Two methods can work perfectly for encrypting connection strings in a Web project configuration file: 1- Do It All Yourself! In this approach, helper functions for encrypting/decrypting configuration file content are implemented. The program would explicitly retrieve the appropriate content from the configuration file and then decrypt it. Disadvantages of this implementation are the added overhead of maintaining the encryption/decryption code, as well as the burden of always ensuring sections are decrypted before use and encrypted again whenever edited. 2- Leverage the .Net 4.0 Framework (The Way to Go!) Fortunately, all the tools needed for protecting configuration files are built into the .Net 2.0/3.5/4.0 versions with very little setup needed. To encrypt connection strings, one can use the ASP.Net IIS Registration Tool (Aspnet_regiis.exe). Note that a 64-bit version of the tool also exists under the Framework64 folder for 64-bit systems. The command we need to encrypt our web.config file connection strings is simply the following: Aspnet_regiis -pe "connectionStrings" -app "/SampleApplication" -prov "RsaProtectedConfigurationProvider" To later decrypt this configuration section: Aspnet_regiis -pd "connectionStrings" -app "/SampleApplication" The command-line options used in the example above are: -pe (section name to encrypt), -pd (section name to decrypt), -app (web application name) and -prov (encryption/decryption provider). Aspnet_regiis supports many more options, which you can read about in the links provided for reference below. ASP.Net automatically decrypts the content of the Web.Config file at runtime, so no programming changes are needed. Another tool, aspnet_setreg.exe, is to be used if certain configuration file sections pertinent to the .Net runtime are to be encrypted. For more information on when and how to use aspnet_setreg, please refer to the references below. Hope this helps! Some great references concerning the topic: http://msdn.microsoft.com/en-us/library/ff650037.aspx http://msdn.microsoft.com/en-us/library/zhhddkxy.aspx http://msdn.microsoft.com/en-us/library/dtkwfdky.aspx http://msdn.microsoft.com/en-us/library/68ze1hb2.aspx
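    As a quick illustration (not part of the original write-up), the same tool can also encrypt a site identified by its physical path rather than an IIS virtual path, using the -pef and -pdf switches. The folder and path below are example values only; substitute your own application directory.

        REM Illustrative sketch only - the framework path and application folder are assumptions.
        REM Run from the .Net 4.0 framework folder (use Framework64 on 64-bit systems).
        cd %windir%\Microsoft.NET\Framework\v4.0.30319

        REM Encrypt the connectionStrings section using the application's physical path.
        aspnet_regiis -pef "connectionStrings" "C:\inetpub\wwwroot\SampleApplication"

        REM Decrypt it again while the configuration is being edited.
        aspnet_regiis -pdf "connectionStrings" "C:\inetpub\wwwroot\SampleApplication"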

    Read the article

  • Which toolkit to use for 3D MMO game development?

    - by Ahmet Yildirim
    Lately I've been thinking about which path to follow for developing a 3D online game. I have googled a lot, but I couldn't find a good article that covers both game development and online server & client development in the same context. This question has been on my mind for about 2 weeks now. So... yesterday I started developing a game from scratch by using the Irrlicht .NET wrapper together with the .NET socket library, which I'm already familiar with. But I found out the .NET wrapper for Irrlicht is not totally finished yet and still lacks features from the original, so I lost all my motivation. So I thought, why not ask the experts before I run into another dead end... Which game engine and networking library is the best way to go for 3D MMO development? Here are some of my early conclusions (please let me know where I'm wrong). C++: best performance for 3D graphics; most game engines have native C++ libraries; lacks a solid socket library (which .NET has); lacks IntelliSense support. C#: IntelliSense support; the .NET socket library; lacks 3D graphics performance; lacks a solid native 3D game engine.

    Read the article

  • Issues with SSL key on CentOS

    - by yummm
    When trying to install an SSL key on my CentOS server, Apache refuses to restart and I see the following errors in my log. [Tue Mar 16 22:32:58 2010] [error] Init: Private key not found [Tue Mar 16 22:32:58 2010] [error] SSL Library Error: 218710120 error:0D094068:asn1 encoding routines:d2i_ASN1_SET:bad tag [Tue Mar 16 22:32:58 2010] [error] SSL Library Error: 218529960 error:0D0680A8:asn1 encoding routines:ASN1_CHECK_TLEN:wrong tag [Tue Mar 16 22:32:58 2010] [error] SSL Library Error: 218595386 error:0D07803A:asn1 encoding routines:ASN1_ITEM_EX_D2I:nested asn1 error [Tue Mar 16 22:32:58 2010] [error] SSL Library Error: 218734605 error:0D09A00D:asn1 encoding routines:d2i_PrivateKey:ASN1 lib What exactly does this mean? Is my SSL key bad? If so, what is the correct way to upload the key to the server? I just opened the crt file in Notepad, copied the data out, and saved it over SSH.
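    Those d2i_PrivateKey/ASN1 errors usually mean the file Apache is pointing at is not a valid PEM private key - often because the key was mangled when pasted through an editor, or because the certificate was saved where the key should be. A rough way to check, assuming file names like server.key and server.crt (the names and paths here are illustrative, not taken from the question):

        # Sketch only - adjust file names/paths to match your SSLCertificateFile
        # and SSLCertificateKeyFile directives.

        # Verify the private key parses and is internally consistent.
        openssl rsa -in /etc/pki/tls/private/server.key -check -noout

        # Verify the certificate parses.
        openssl x509 -in /etc/pki/tls/certs/server.crt -noout -text | head

        # Confirm the key and certificate actually belong together
        # (the two MD5 sums should match).
        openssl rsa  -in /etc/pki/tls/private/server.key -noout -modulus | openssl md5
        openssl x509 -in /etc/pki/tls/certs/server.crt  -noout -modulus | openssl md5

    If the key fails to parse, re-copy it with a binary-safe method such as scp rather than pasting through an editor, and make sure it still has its -----BEGIN ... PRIVATE KEY----- header and footer lines intact.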

    Read the article

  • Resolving "not found" messages after running ./configure when building node.js

    - by duke
    Hello, I am trying to install node.js on Debian AMD64. I got node.js from git. When I do ./configure, a bunch of "checking for program" messages say "not found". I want to resolve all of these and ensure everything needed is present. Can anyone suggest what I need to do to resolve the "not found" messages? Thanks heaps. server:/devel/node# ./configure Checking for program g++ or c++ : /usr/bin/g++ Checking for program cpp : /usr/bin/cpp Checking for program ar : /usr/bin/ar Checking for program ranlib : /usr/bin/ranlib Checking for g++ : ok Checking for program gcc or cc : /usr/bin/gcc Checking for gcc : ok Checking for library dl : yes Checking for library execinfo : not found Checking for openssl : not found Checking for function SSL_library_init : yes Checking for header openssl/crypto.h : yes Checking for library rt : yes --- libeio --- Checking for library pthread : yes Checking for function pthread_create : yes Checking for function pthread_atfork : yes Checking for futimes(2) : yes Checking for readahead(2) : yes Checking for fdatasync(2) : yes Checking for pread(2) and pwrite(2) : yes Checking for sendfile(2) : yes Checking for sync_file_range(2) : yes --- libev --- Checking for header sys/inotify.h : yes Checking for function inotify_init : yes Checking for header sys/epoll.h : yes Checking for function epoll_ctl : yes Checking for header port.h : not found Checking for header poll.h : yes Checking for function poll : yes Checking for header sys/event.h : not found Checking for header sys/queue.h : yes Checking for function kqueue : not found Checking for header sys/select.h : yes Checking for function select : yes Checking for header sys/eventfd.h : not found Checking for SYS_clock_gettime : yes Checking for library rt : yes Checking for function clock_gettime : yes Checking for function nanosleep : yes Checking for function ceil : yes Checking for fdatasync(2) with c++ : yes 'configure' finished successfully (1.479s) server:/devel/node#
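    For what it's worth, several of those "not found" results are harmless on Debian: port.h, sys/event.h and kqueue are BSD/Solaris interfaces, and execinfo is only a separate library on BSD, so configure is expected to skip them on Linux. The one worth fixing is the openssl pkg-config check. A minimal setup along these lines (package names are the stock Debian ones, not something the question specifies) usually clears it:

        # Sketch only - typical build prerequisites for node.js on Debian.
        apt-get update
        apt-get install build-essential pkg-config libssl-dev python

        # Re-run the build from the source checkout.
        ./configure
        make
        make install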

    Read the article

  • Starting out with OpenGL when most tutorials are out of date

    - by AUTO
    I'm sure there are already a bunch of questions like this asked, but the constant updating of the OpenGL library throws them all away, and in a month or two, the answers here will be worthless again. I am ready to start programming in OpenGL using C++. I've got a working compiler (DevCpp; do NOT ask me to switch to VC++, and don't ask me why). Now I'm just looking for a solid tutorial on how to program with OpenGL. My assistant found the tutorial provided by NeHe Productions, but as I've come to find out, it's WAY OUT OF DATE! (although I did pull together a basic window to support an OpenGL canvas) Then I went online and found the OpenGL SuperBible, which apparently uses freeglut? But what I'd like to know is whether or not the SuperBible 5th edition is still up to date. The suggestion to use freeglut I found said the latest version was 2.6.0, but now it's 2.8.0! Is the OpenGL SuperBible still a good, and fairly up-to-date, place to start? Is there a better place to go to learn OpenGL? Am I allowed to simply store freeglut in the DevCpp include directory (maybe in GL), or is there some important procedure? Are there any comments or suggestions that I didn't think to ask since I'm only just beginning? @dreta cleared some things up for me, so now I have a better idea of what to ask: I think I'd like to start out with OpenGL using a wrapper library instead of directly accessing OpenGL. I just think that, for a beginner, it would be easier for me to program and get good results, while I don't yet have to understand all the grimy details (as @stephelton mentioned). The problem is, I can't find any library that doesn't have undefined references to no-longer-supported functions. Freeglut sounds operational, but it still uses GLU. Does anyone know what I can do? Also, I tried compiling the first SuperBible's source, but I got errors since GLAPI is not being defined as a type, the error originating in the GLU library. I'd like to use the SuperBible, but I don't know how to fix this.

    Read the article

  • Windows 7 Media Player won't add MP3 files

    - by AnthonyWJones
    I have a set of MP3 files that Windows Media Player just refuses to add to the library. They are placed in the standard My Music folder. I can play them in Media Player, but they just won't be listed in the library. I've tried dragging them and dropping them onto the media player, but they still don't appear in the library. I have an identical laptop where I've also copied the MP3 files, and there they appear in the library fine. Any ideas what would cause this?

    Read the article

  • Is there a way to communicate with a DBMS using raw memory blocks or binaries

    - by darkcminor
    I am trying to get a numerical matrix operations library like LAPACK to communicate with a DBMS. Is it possible to send/receive complete matrices as binary data, or as direct memory pointers, so the library can process them? (It would work something like this: the outside library processes data stored in the DBMS, computes some huge matrix operation, and then the DBMS gets the result back from the library via a memory block or a binary.) The main purpose is speed, avoiding a pass through a flat file and, last but not least, using the library to efficiently do some operations that DBMSs are not designed to do. Is it possible that Oracle, SQL Server or MySQL support this technique?

    Read the article

  • Oracle support note for Leap Second Hang problem that may result into 100% CPU utilization in Linux environment

    - by Anand Akela
    On or around July 1, 2012, Oracle became aware of an issue on Linux distributions resulting from the introduction of the leap second; this is causing problems for some customers. Leap seconds may be introduced at the end of June or December in a calendar year, like 2012, as necessary to maintain time standards. Servers hosting Oracle products which are clients of an NTP server (Network Time Protocol) may be particularly susceptible to this issue as the NTP server is updated. Linux distributions which may be affected include Oracle Enterprise Linux, Red Hat Enterprise Linux, Oracle VM and Oracle Unbreakable Enterprise Kernel. Asianux 2 and 3, based on RHEL 4 and 5, may also be affected. One report of correction to high agent CPU using Note 1472421.1 on SLES11 has also been received. Not all customers will be affected, but those who are affected may observe higher than normal CPU consumption on their Linux environments where JVMs are utilized. In Oracle Enterprise Manager (EM), this problem can manifest itself as high CPU consumption with the EM Agent process (which runs on a JVM in EM 12c, for instance). It is possible that the OMS is also affected. We would advise customers to review the description of this problem in MOS Note 1472651.1 and take action if they observe that their environment is affected. Contributed by Andrew Bulloch, Director, Application Systems Management Products
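    For readers who just need to stop the CPU spin while they review the note: the widely circulated interim workaround on affected Linux systems is to reset the system clock so the kernel clears its leap-second state. The commands below are a generic sketch of that workaround, not the steps from MOS Note 1472651.1; please follow the note for the supported procedure.

        # Illustrative workaround only - follow MOS Note 1472651.1 for the supported steps.
        # Stop the NTP daemon so it does not immediately reintroduce the condition.
        /etc/init.d/ntpd stop

        # Setting the clock clears the kernel leap-second state that makes
        # timer-heavy processes such as JVMs spin at 100% CPU.
        date -s "$(date '+%Y-%m-%d %H:%M:%S')"

        # Restart NTP once the clock has been reset.
        /etc/init.d/ntpd start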

    Read the article

  • Can't find gnutls library when executing rpmbuild under non-root

    - by Rilindo
    I am trying to build ntfsprogs (ntfs-3g) from the latest source, using the .spec from RPMforge, as non-root via rpmbuild. During the compile, it fails at this step: checking for GNUTLS... no configure: error: ntfsprogs crypto code requires the gnutls library. error: Bad exit status from /var/tmp/rpm-tmp.78913 (%build) However, I can compile it successfully outside of rpmbuild, so it sounds like it's just a matter of the library being seen during the build. I can confirm that rpmbuild can see the location where the gnutls library resides: [foo@bar ~]$ rpmbuild -E '%{_libdir}' rpmbuild/SPECS/ntfsprogs.spec /usr/lib Library location: [foo@bar ntfs-3g_ntfsprogs-2012.1.15]$ /sbin/ldconfig -p | grep -i gnutls libgnutls.so.13 (libc6) => /usr/lib/libgnutls.so.13 libgnutls.so (libc6) => /usr/lib/libgnutls.so libgnutls-openssl.so.13 (libc6) => /usr/lib/libgnutls-openssl.so.13 libgnutls-openssl.so (libc6) => /usr/lib/libgnutls-openssl.so libgnutls-extra.so.13 (libc6) => /usr/lib/libgnutls-extra.so.13 libgnutls-extra.so (libc6) => /usr/lib/libgnutls-extra.so What would cause the problem of the library not being seen when you build an RPM? EDIT: Oh yeah, I am running CentOS 5.5.
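    One thing worth checking (an educated guess rather than something stated in the question): configure scripts normally detect GNUTLS through pkg-config and the development headers, not through the runtime .so files that ldconfig lists, and on CentOS those development files live in a separate package. A quick sketch of what that check and fix might look like:

        # Sketch only - package and spec names assume a stock CentOS 5 setup.
        # The runtime library being visible is not enough; configure wants the
        # headers and the gnutls.pc file shipped in the -devel package.
        pkg-config --modversion gnutls || echo "gnutls development files not found"

        # Install the development package (needs root, e.g. via sudo),
        # then retry the non-root rpmbuild.
        sudo yum install gnutls-devel
        rpmbuild -ba rpmbuild/SPECS/ntfsprogs.spec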

    Read the article

< Previous Page | 182 183 184 185 186 187 188 189 190 191 192 193  | Next Page >