Search Results

Search found 9233 results on 370 pages for 'building'.

  • Who Makes a Good Product Owner

    - by Robert May
    In general, the best product owners are those that care passionately about the customer of the product. Note that I didn't say about the product itself. Actually, people that only care about the product generally do not make good product owners. Products only matter in relationship to their customers. If a product doesn't provide value to the customer, then the product has no value, no matter what a person might think of the product, and no matter what cool technologies exist inside of it.

    A good product owner is also a good negotiator. They recognize that many different priorities exist inside of a corporation, but that there can be only one list that developers work from. A good product owner recognizes that it's their job to help others around them prioritize (perhaps with a Product Council), but also understands that they alone have the final say about priorities and must be willing to make the tough decisions required. Deciding the priority between two perfectly valid stories is very difficult, especially when the stories are from two different departments!

    A good product owner is deeply interested in helping the team be successful. They don't seek to control the team, but instead seek to understand what the team can do and then work with the team to get the best product possible for the customer. A good product owner is never denigrating to team members, ever. They recognize that such behavior would damage the trust that needs to be present between team members and product owners, and will avoid it at all costs.

    In general, technical people (i.e. former or current developers) make poor product owners. In their minds, they can't separate implementation details from user functionality, so their stories end up sounding like implementation details. For example, "The user enters their username on the password screen" is something that a technical product owner would write. The proper wording for that story is "A user supplies the system with their credentials." Because technical people think differently from the rest of the population, they are generally not a good fit.

    A good product owner is also a good writer. Writing good stories demands good writing. The art of persuasion, descriptiveness and just plain good grammar are all required. A good product owner must also be well spoken, since much of what will be conveyed will be conveyed with the spoken word, not just the written word.

    A good product owner is a "people person." They like talking to people and are very patient. They don't mind having questions repeated or fielding many questions, because they want to make sure that the ideas they're conveying are properly understood so the customer gets the best product possible. They are happy to answer any questions a team member may have and invite feedback and criticism of designs and stories, since they want a good product. They have little ego that gets in the way of building a great product.

    All of these qualities can be hard to find, but if you look closely enough, you'll find the right person in your organization. Product owners can be found anywhere, not just in upper management. Some of the best product owners are those that are very close to the customer. In fact, check your customer support staff. I'd bet that several great product owners are lurking there.

    One final note about what makes a good product owner: you're probably NOT going to find one in a manager, especially someone who considers themselves a "Manager." Product owners don't manage anything but the backlog, so be especially careful if the person you're selecting for product owner is a manager.

    Up Next: "Messing with the Team." Technorati Tags: Scrum,Product Owner

    Read the article

  • IE9, HTML5 and truck load of other stuff happening around the web

    - by Harish Ranganathan
    First of all, I haven't been updating this blog as regularly as I used to, primarily because I was visiting a lot of cities talking about SharePoint, WebMatrix, IE9 and a few other things. IE9 is my new-found love and I simply think we have done great work in improving the browser and the browsing experience for our users. This post talks about IE, things happening around the web in general, and a few misconceptions around IE (I had earlier written about IE8 and common myths).

    When you think about the way the web has transformed, it's truly amazing. Rewind back to the late 90s and early 2000s: the web was a luxury. There were a lot of desktop applications around, and web applications were just starting to pick up. The primary reason was that not a lot of folks were into web development, and the areas of the web were confined to HTML and JavaScript. CSS was around here and there, but no one took it very seriously. XML and XSLT were fast picking up and contributed to decent web development techniques.

    So as web developers, all we had to worry about was building good-looking websites that worked well with IE6 and occasionally with Safari. Firefox was not even in the picture then, and neither was Chrome. But with the various arms of the W3C consortium and other bodies working actively on things like CSS, SVG and XHTML, a few more areas came into the picture when it comes to browsers supporting standards. IE6 for sure wasn't up to speed, and the main issues we were tackling then were privacy and piracy. We invested a lot of effort to curb piracy, and one of the steps was that IE7, the next version of IE, would install only on genuine Windows machines. What this means is that people who were knowingly or unknowingly running pirated Windows XP could not install IE7, and the limitations of IE6 really hurt them. One more thing of importance: if you were running pirated Windows, chances are you didn't get the security updates and were thereby vulnerable to running viruses/trojans on your system; many of those actually block using IE in the first place and make it difficult to browse. SP2 came as a big boon, but again was there only for genuine Windows machines.

    With Firefox coming as a free install and also heavily pushed by Google then, it was natural that people would try an alternative. By then, we had started working on IE8 supporting the best standards (note that HTML5, CSS 2.1 and other specs were then works in progress; they still are). Later, Google in their infinite wisdom realized that with Firefox they were going nowhere, and they released Chrome. Now they heavily push Chrome even to Firefox users, which is natural, since it's their browser. Meanwhile, these browsers push their updates as mandatory and therefore have a very short lifecycle to add enhancements and support for things like CSS. When IE8 came out, it really was the best standards-supporting browser at the time, and a lot of people saw our efforts in improving our browser.

    HTML5 is the buzzword in the industry, and there is a lot of noise being made by many browsers claiming their support for it. IE8 doesn't have much support for HTML5, but with IE9 Beta we have great support for many of the HTML5 specifications. Note that HTML5 is still a work in progress, and one of the board members working on the spec has mentioned that these specs might change, so relying on them heavily is dangerous. But some of the advances, such as the video tag, are indeed supported in IE9 Beta. IE9 Beta also has full hardware acceleration support, which other browsers don't have.

    IE8 had advanced security features such as the SmartScreen Filter, InPrivate Browsing, anti-phishing and a lot of other things. IE9 builds on top of these with the best-in-town security standards as well as support for HTML5, CSS3, hardware acceleration, SVG and many other advancements in the browser. Read more at http://www.beautyoftheweb.com/#/highlights/html5

    To summarize, IE9 Beta is really innovative, and you should try it to believe what it provides. You can visit http://www.beautyoftheweb.com/ to install it as well as read more. Cheers !!!

    Read the article

  • T-SQL Tuesday #005 : SSRS Parameters and MDX Data Sets

    - by blakmk
    Well, this week's T-SQL Tuesday #005 topic seems quite fitting. Having spent the past few weeks creating reports and dashboards in SSRS and SSAS 2008, I was frustrated by how difficult it is to use custom datasets to generate parameter drill-downs. It also seems Reporting Services can be quite unforgiving when it comes to renaming things like datasets, so I want to share a couple of techniques that I found useful.

    One of the things I regularly do is add parameters to the queries. However, doing this causes Reporting Services to generate a hidden dataset and parameter name for you. One of the things I like to do is tweak these hidden datasets, removing the 'ALL' level, which is a tip I picked up from Devin Knight in his blog. There are some rules I've developed for myself since working with SSRS and MDX; they may not be the best or only way, but they work for me.

    Rule 1 – Never trust the automatically generated hidden datasets

    Or even ANY automatically generated MDX queries, for that matter... I've previously blogged about this here. If you examine the MDX generated in the hidden dataset, you will see that it generates the MDX in the context of the originating query by building a subcube. This means it may NOT be appropriate to use it in a subsequent query which has a different context. Make sure you always understand what is going on.

    Often when I'm developing a dashboard or a report, there are several parameter-oriented datasets that I like to create manually. It can be that I have different datasets using the same dimension but in a different context. One example of this is that I often use a dataset for the last month and a dataset for the last 6 months; both use the same date hierarchy. However, Reporting Services seems not to be too smart when it comes to generating unique datasets when working with and renaming parameters and datasets. Very often I have come across this error when refactoring parameter names and default datasets: "an item with the same key has already been added". The only way I've found to reliably avoid this is to obey rule 2.

    Rule 2 – Follow this sequence when it comes to working with Parameters and DataSets:

    1. Create lookup and default datasets in advance
    2. Create parameters (set the datasets for available and default values)
    3. Go into the query and tick the parameter check box
    4. On the dataset properties screen, select the parameter defined earlier for the parameter value defined earlier

    Rule 3 – Don't tear your hair out when you have just renamed objects and your report doesn't build

    Just use XML Notepad on the original report file. I found I gained a good understanding of the structure of the underlying XML document just by using XML Notepad. From this you can do a search and find references to the missing object. You can also just do a wholesale search and replace (after taking a backup copy, of course ;-).

    So I hope the above helps to save the sanity of anyone who regularly works with SSRS and MDX.

    @Blakmk

    Read the article

  • T-SQL in Chicago – the LobsterPot teams with DataEducation

    - by Rob Farley
    In May, I'll be in the US. I have board meetings for PASS at the SQLRally event in Dallas, and then I'm going to be spending a bit of time in Chicago. The big news is that while I'm in Chicago (May 14-16), I'm going to teach my "Advanced T-SQL Querying and Reporting: Building Effectiveness" course. This is a course that I've been teaching since the 2005 days, and have modified over time for 2008 and 2012. It's very much my most popular course, and I love teaching it. Let me tell you why.

    For years, I wrote queries and thought I was good at it. I was a developer. I'd written a lot of C (and other, more fun languages like Prolog and Lisp) at university, and then got into the 'real world' and coded in VB, PL/SQL, and so on through to C#, and saw SQL (whichever database system it was) as just a way of getting the data back. I could write a query to return just about whatever data I wanted, and that was good. I was better at it than the people around me, and that helped. (It didn't help my progression into management, then it just became a frustration, but for the most part, it was good to know that I was good at this particular thing.)

    But then I discovered the other side of querying – the execution plan. I started to learn about the translation from what I'd written into the plan, and this impacted my query-writing significantly. I look back at the queries I wrote before I understood this, and shudder. I wrote queries that were correct, but often a long way from effective. I'd done query tuning, but had largely done it without considering the plan, just inferring what indexes would help.

    This is not a performance-tuning course. It's focused on the T-SQL that you read and write. But performance is a significant and recurring theme. Effective T-SQL has to be about performance – it's the biggest way that a query becomes effective. There are other aspects too though – such as using constructs better. For example – I can write code that modifies data nicely, but if I haven't learned about the MERGE statement and the way that it can impact things, I'm missing a few tricks.

    If you're going to do this course, a good place to be is the situation I was in a few years before I wrote this course. You're probably comfortable with writing T-SQL queries. You know how to make a SELECT statement do what you need it to, but feel there has to be a better way. You can write JOINs easily, and understand how to use LEFT JOIN to make sure you don't filter out rows from the first table, but you're coding blind.

    The first module I cover is on Query Execution. Take a look at the Course Outline at Data Education's website. The first part of the first module is on the components of a SELECT statement (where I make you think harder about GROUP BY than you probably have before), but then we jump straight into Execution Plans. Some stuff on indexes is in there too, as is simplification and SARGability. Some of this is stuff that you may have heard me present on at conferences, but here you have me for three days straight. I'm sure you can imagine that we revisit these topics throughout the rest of the course as well, and you'd be right.

    In the second and third modules we look at a bunch of other aspects, including some of the T-SQL constructs that lots of people don't know, and various other things that can help your T-SQL be, well, more effective. I've had quite a lot of people do this course and be itching to get back to work even on the first day. That's not a comment about the jokes I tell, but because people want to look at the queries they run.

    LobsterPot Solutions is thrilled to be partnering with Data Education to bring this training to Chicago. Visit their website to register for the course. @rob_farley

    Read the article

  • SharePoint 2010 Hosting :: Setting Default Column Values on a Folder Programmatically

    - by mbridge
    The reason I write this post today is because my initial searches on the Internet provided me with nothing on the topic. I was hoping to find a reference to the SDK, but I didn't have any luck. What I want to do is set a default column value on an existing folder so that new items in that folder automatically inherit that value. It's actually pretty easy to do once you know what the class is called in the API. I did some digging and discovered that class is MetadataDefaults. It can be found in Microsoft.Office.DocumentManagement.dll. Note: if you can't find it in the GAC, this DLL is in the 14/CONFIG/BIN folder and not the 14/ISAPI folder. Add a reference to this DLL in your project. In my case, I am building a console application, but you might put this in an event receiver or workflow.

    In my example today, I have simple custom folder and document content types. I have one shared site column called DocumentType. I have a document library with each of these content types registered. In my document library, I have a folder named Test, and I want to set its default column values using code. Here is what it looks like.

    Start by getting a reference to the list in question. This assumes you already have a SPWeb object; in my case I have created it and it is called site.

    SPList customDocumentLibrary = site.Lists["CustomDocuments"];

    You then pass the SPList object to the MetadataDefaults constructor.

    MetadataDefaults columnDefaults = new MetadataDefaults(customDocumentLibrary);

    Now I just need to get my SPFolder object in question and pass it to the method SetFieldDefault. This takes an SPFolder object, a string with the name of the SPField to set the default on, and finally the value of the default (in my case "Memo").

    SPFolder testFolder = customDocumentLibrary.RootFolder.SubFolders["Test"];
    columnDefaults.SetFieldDefault(testFolder, "DocumentType", "Memo");

    You can set multiple defaults here. When you're done, you will need to call .Update().

    columnDefaults.Update();

    Here is what it all looks like together.

    using (SPSite siteCollection = new SPSite("http://sp2010/sites/ECMSource"))
    {
        using (SPWeb site = siteCollection.OpenWeb())
        {
            SPList customDocumentLibrary = site.Lists["CustomDocuments"];
            MetadataDefaults columnDefaults = new MetadataDefaults(customDocumentLibrary);
            SPFolder testFolder = customDocumentLibrary.RootFolder.SubFolders["Test"];
            columnDefaults.SetFieldDefault(testFolder, "DocumentType", "Memo");
            columnDefaults.Update();
        }
    }

    You can verify that your property was set correctly on the Change Default Column Values page in your list settings.

    This is something that I could see used a lot in an ItemEventReceiver attached to a folder to do metadata inheritance. Whenever the user changes the value of the folder's property, you could have it update the default. Your code might look something like:

    columnDefaults.SetFieldDefault(properties.ListItem.Folder, "MyField", properties.ListItem["MyField"]);

    This is a great way to keep the child items updated any time the value of a folder's property changes. I'm also wondering if this can be done via CAML. I tried saving a site template, but after importing I got an error on the default values page. I'll keep looking and let you know what I find out.
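    For completeness, here is a minimal sketch of the event receiver idea mentioned above. Treat it as an illustration rather than code from the original post: the class name is hypothetical, and it assumes the receiver is registered on the same library and that the folder itself carries the DocumentType column.

    using System;
    using Microsoft.SharePoint;
    using Microsoft.Office.DocumentManagement;

    // Hypothetical receiver: when a folder's own DocumentType value changes,
    // push that value down as the folder's default for new items.
    public class FolderDefaultsReceiver : SPItemEventReceiver
    {
        public override void ItemUpdated(SPItemEventProperties properties)
        {
            base.ItemUpdated(properties);

            // Only folders have per-location defaults we care about here.
            if (properties.ListItem == null || properties.ListItem.Folder == null)
                return;

            MetadataDefaults defaults = new MetadataDefaults(properties.List);
            defaults.SetFieldDefault(properties.ListItem.Folder, "DocumentType",
                Convert.ToString(properties.ListItem["DocumentType"]));
            defaults.Update();
        }
    }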

    Read the article

  • Python Coding standards vs. productivity

    - by Shroatmeister
    I work for a large humanitarian organisation, on a project building software that could help save lives in emergencies by speeding up the distribution of food. Many NGOs desperately need our software and we are weeks behind schedule.

    One thing that worries me in this project is what I think is an excessive focus on coding standards. We write in Python/Django and use a version of PEP 8, with various modifications: e.g. line lengths can go up to 160 chars and all lines should go that long if possible, no blank lines between imports, line-wrapping rules that apply only to certain kinds of classes, lots of templates that we must use even if they aren't the best way to solve a problem, etc. etc.

    One core dev spent a week rewriting a major part of the system to meet the then-new coding standards, throwing away several suites of tests in the process, as the rewrite meant they were 'invalid'. We spent two weeks rewriting all the functionality that was lost, and fixing bugs. He is the lead dev and his word carries weight, so he has convinced the project manager that these standards are necessary. The junior devs do as they are told. I sense that the project manager has a strong feeling of cognitive dissonance about all this but nevertheless agrees with it vehemently, as he feels unsure what else to do.

    Today I got in serious trouble because I had forgotten to put some spaces after commas in a keyword argument. I was literally shouted at by two other devs and the project manager during a Skype call. Personally I think coding standards are important, but I also think that we are wasting a lot of time obsessing over them, and when I verbalized this it provoked rage. I'm seen as a troublemaker in the team, a team that is looking for scapegoats for its failings.

    Since the introduction of the coding standards, the team's productivity has measurably plummeted; however, this only reinforces the obsession, i.e. the lead dev simply blames our non-adherence to standards for the lack of progress. He believes that we can't read each other's code if we don't adhere to the conventions. This is starting to turn sticky.

    Now I am trying to modify various scripts, autopep8, pep8ify and PythonTidy, to try to match the conventions. We also run pep8 against the source code, but there are so many implicit amendments to our standard that it's hard to track them all. The lead dev simply picks faults that the pep8 script doesn't pick up and shouts at us in the next stand-up meeting. Every week there are new additions to the coding standards that force us to rewrite existing, working, tested code. Thank heavens we still have tests (I reverted some commits and fixed a bunch of the ones he removed). All the while there is increasing pressure to meet the deadline.

    I believe a fundamental issue is that the lead dev and another core dev refuse to trust other developers to do their job. But how do I deal with that? We can't do our job because we are too busy rewriting everything. I've never encountered this dynamic in a software engineering team. Am I wrong to question their adherence to coding standards? Has anyone else experienced a similar situation, and how have they dealt with it successfully? (I'm not looking for a discussion, just actual solutions people have found.)

    Read the article

  • My Experience at Oracle !!! By Ayush Gupta

    - by Nadiya
    Hi! My name is Ayush, a graduate of BITS Pilani, now working and living in Bangalore. I joined Oracle in August 2013 as a Senior Consultant (SC) and would like to share my experiences over the first couple of months with you.

    It has been a wonderful journey so far. The last two months have been very exciting for me. First of all, I would like to mention that the training program at Oracle that we went through really prepared us well. It matured us and allowed us to go from developing small applications in college to big enterprise products. Two months of initial training has had a lasting impact for me. I am also really enjoying the knowledge I have gained so far, and am learning new things in the form of product training. It's really fun to work here. We are treated like adults and we are responsible for our own workloads.

    With that, I can't keep from mentioning the fun times we as a team have had, in the form of the Young Leadership programme at Hotel Fortune Trinity, which included a luxurious buffet lunch too. Wishing it could happen more frequently.

    Oracle provides one of the best opportunities to learn various technologies across different platforms. What I like best about working at Oracle is the work-life balance. With the option of flexible timings, one can easily enjoy planned evenings with friends, or maybe working out at the fitness centre in your building. Be it the birthday celebrations at the office or the day-long team outing at a resort, it's altogether a different experience. Overall, you get to take full ownership of your project, and they give you free rein on how you design your enhancements/changes.

    As one of the largest international companies, Oracle is obviously an expert at exploring the potential and possibility of inexperienced new hires. We were taught how to make an outstanding team work in a group training session in the first few weeks. From this experience I realized that perfect cooperation is not about where you come from or what your study background is; everyone can find his or her own role to support the team. Even though I am not that skilled in technology, my background has significantly helped me in learning new technologies at Oracle.

    My idea and suggestion is: for new joiners, the will to learn is more important than what you have learnt before. Colleagues here at Oracle are professionals in their field, always friendly and glad to help. So don't worry: all you need to do is be confident and have a nice attitude, and Oracle will let you fully display your talent. Come and join us; here you can always find a tailor-made role for you!

    Read the article

  • ArchBeat Link-o-Rama Top 10 for August 2012

    - by Bob Rhubart
    The Top 10 most popular items shared via the OTN ArchBeat Facebook page for the month of August 2012.

    Now Available: Oracle SQL Developer 3.2 (3.2.09.23)
    New features include APEX listener, UI enhancements, and 12c database support.

    The Role of Oracle VM Server for SPARC in a Virtualization Strategy
    In this article, Matthias Pfutzner discusses hardware, desktop, and operating system virtualization, along with various Oracle virtualization technologies, including Oracle VM Server for SPARC.

    How to Manually Install Flash Player Plugin to see the Oracle Enterprise Manager Performance Page | Kai Yu
    So, you're a DBA and you want to check the Performance page in Oracle Enterprise Manager (11g or 12c). So you click the Performance tab and… nothing. Zip. Nada. The Flash plugin is a no-show. Relax! Oracle ACE Director Kai Yu shows you what you need to do to see all the pretty colors instead of that dull grey screen.

    Relationally Challenged (CX - CRM - EQ/RQ/CRQ) | Chris Warticki
    Self-proclaimed Oracle Support "spokesmodel" Chris Warticki has some advice for those interested in Customer Relationship Management: "How about we just dumb it down, strip it to the core, keep it simple and LISTEN?! No more focus groups, no more surveys, and no need to gather more data. We have plenty of that. Why not just provide the customer what they are asking for?"

    Free WebLogic Server Course | Middleware Magic
    So you want to sharpen your Oracle WebLogic Server skills, but you prefer to skip the whole classroom bit and don't want to be bothered with dealing with an instructor? No problem! Oracle ACE Rene van Wijk, a prolific Middleware Magic blogger, has information on an Oracle WebLogic course you can take on your own time, at your own pace.

    Oracle VM VirtualBox 4.1.20 released
    Oracle VM VirtualBox 4.1.20 was just released at the community and Oracle download sites, reports the Fat Bloke. This is a maintenance release containing bug fixes and stability improvements.

    Optimizing OLTP Oracle Database Performance using Dell Express Flash PCIe SSDs | Kai Yu
    Oracle ACE Director Kai Yu shares resources based on "several extensive performance studies on a single node Oracle 11g R2 database as well as a two node 11gR2 Oracle Real Application clusters (RAC) database running on Dell PowerEdge R720 servers with Dell Express Flash PCIe SSDs on Oracle Enterprise Linux 6.2 platform."

    Oracle ACE sessions at Oracle OpenWorld
    With so many great sessions at this year's event, building your Oracle OpenWorld schedule can involve making a lot of tough choices. But you'll find that the sessions led by Oracle ACEs just might be the icing on the cake for your OpenWorld experience.

    MySQL Update: The Cleveland MySQL Meetup (Independence, OH)
    Oracle MySQL team member Benjamin Wood, a MySQL engineer and five-year veteran of the MySQL organization, will speak at the Cleveland MySQL Meetup event on September 12. The presentation will include a MySQL 5.5 overview, Oracle's roadmap for MySQL (including specifics on MySQL 5.6), best practices and how to overcome development and operational MySQL challenges, and the new MySQL commercial extensions. Click the link for time and location information.

    Parsing XML in Oracle Database | Martijn van der Kamp
    Martijn van der Kamp's post deals with processing XML in PL/SQL code and processing the data into the database.

    Thought for the Day
    "Walking on water and developing software from a specification are easy if both are frozen." — Edward V. Berard
    Source: SoftwareQuotes.com

    Read the article

  • Silverlight Cream for December 12, 2010 - 2 -- #1009

    - by Dave Campbell
    In this Issue: Michael Crump, Jesse Liberty, Shawn Wildermuth, Domagoj Pavlešic, Peter Kuhn, James Ashley, Sara Summers, Morten Nielsen, Peter Torr, and Tau Sick.

    Above the Fold:
    Silverlight: "Silverlight 4 – Coded UI Framework Video Tutorial" Michael Crump
    WP7: "Windows Phone From Scratch #12–Custom Behaviors (Part I)" Jesse Liberty

    From SilverlightCream.com:

    Silverlight 4 – Coded UI Framework Video Tutorial
    Michael Crump posted a video tutorial today on the Coded UI Test Framework that we got with the VS2010 Feature Pack 2. Wanna create automated tests? ... check out Michael's video and save yourself some time.

    Windows Phone From Scratch #12–Custom Behaviors (Part I)
    Jesse Liberty posted his Windows Phone from Scratch number 12 today... and it's on Custom Behaviors... cool stuff... need to read this and get your head around it... this is part 1, jump on it before he drops part 2 on us!

    The Next Application Platform? All of them...
    Shawn Wildermuth has a thought-provoking post up ... check it out and see if you're ready to join him on the adventure of building for all the platforms...

    Windows Phone 7 Accelerometer Test App
    Domagoj Pavlešic has a test app up for the accelerometer on the WP7 ... if you need to use it, and are having problems, a good example always helps me.

    Protocol of developing an animation texture tool
    Peter Kuhn found a need for a tool to create some animations for a WP7 XNA game... so he challenged himself to write it, and detailed out all his steps as he went.

    Re-examining WP7 Launchers and Choosers
    James Ashley's most recent post is on the Pivot Control ... check this out... add a working horizontally oriented slider to a pivot... plus some external links to help out.

    New Prototyping Sketch Sheets for WP7
    This is one of those posts that I had to go to SilverlightCream and make sure I hadn't hit it yet... pretty cool prototype sheets for WP7 by Sara Summers ... we've seen others, they're all good.

    Simulating GPS on Windows Phone 7
    Morten Nielsen helps you get around the fact that you're not going to be able to use the emulator for testing your GPS app ... at least not without some assistance... and that doesn't mean hauling your dev system around your neighborhood, either.

    How to correctly handle application deactivation and reactivation
    We've seen posts on tombstoning, but probably not from Silverlight team members... check this one out from Peter Torr ... great event sequence information and all the info on how to correctly handle it, plus external links to the documentation... you knew there was documentation, right? :)

    Localizing a Windows Phone 7 Application
    Tau Sick has a post up discussing localization and your WP7 apps... coming from someone with an app in the marketplace in 3 languages, it's a pretty good bet he's got it figured out!

    Stay in the 'Light!
    Twitter SilverlightNews | Twitter WynApse | WynApse.com | Tagged Posts | SilverlightCream
    Join me @ SilverlightCream | Phoenix Silverlight User Group
    Technorati Tags: Silverlight    Silverlight 3    Silverlight 4    Windows Phone MIX10

    Read the article

  • Getting Started with StreamInsight 2.1

    - by Roman Schindlauer
    If you're just beginning to get familiar with StreamInsight, you may be looking for a way to get started. What are the basics? How can I get my first StreamInsight application running so I can see how it works? Where is the 'front door' that will get me going? If that describes you, then this blog entry might be just what you need. If you're already a StreamInsight wiz, keep reading anyway - you may find some helpful links here that you weren't aware of. But here's what we'd like from you experienced readers in particular: if you know of other good resources that we missed, please feel free to add them in the comments below. We appreciate you sharing your expertise.

    The Book

    The basic documentation for StreamInsight is located in the MSDN Library (Microsoft StreamInsight 2.1). You'll notice that previous versions of StreamInsight are still there (1.2 and 2.0), but if you're just getting started you can stick to the 2.1 section. The documentation has been organized to function as reference material, which is fine after you're familiar with the technology. But if you're trying to learn the basics, you might want to take a different path instead of just starting at the top. The following is one map you can use.

    What Is StreamInsight?

    Here is a sequence of topics that should give you a good overview of what StreamInsight is and how it works:

    Overview - answers the question, "what is it?"
    StreamInsight Server Architecture - gives you a quick look at a high-level architectural drawing
    StreamInsight Concepts - lays out an overview of the basic components
    Deploying StreamInsight Entities to a StreamInsight Server - describes the mechanics of how these components work together

    Getting an Example Running

    Once you have this background, go ahead and install StreamInsight and get a basic example up and running:

    Installation - download and install the software
    StreamInsight Examples - walk through a set of 3 simple StreamInsight applications that work together to demonstrate what you learned in the topics above; you can copy and paste the code into Visual Studio, compile, and run

    That's it - you now have a real, functioning StreamInsight system! Now that you have a handle on the basics, you might want to start digging deeper.
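    Before digging deeper, it may help to see roughly what a complete StreamInsight application looks like in code. The sketch below is illustrative only: it follows the 2.1 Observable-based development model, and the instance name "Default", the application and process names, and the counter payload are placeholder assumptions you would replace with your own.

    using System;
    using System.Reactive;
    using System.Reactive.Linq;
    using Microsoft.ComplexEventProcessing;
    using Microsoft.ComplexEventProcessing.Linq;

    class MinimalStreamInsight
    {
        static void Main()
        {
            // Connect to an embedded (in-process) server. "Default" must match
            // an instance name chosen during StreamInsight installation.
            using (Server server = Server.Create("Default"))
            {
                Application app = server.CreateApplication("HelloStreamInsight");

                // Source: one point event per second, payload is a running counter.
                var source = app
                    .DefineObservable(() => Observable.Interval(TimeSpan.FromSeconds(1)))
                    .ToPointStreamable(
                        x => PointEvent.CreateInsert(DateTimeOffset.Now, x),
                        AdvanceTimeSettings.IncreasingStartTime);

                // Query: count the events falling into each 5-second tumbling window.
                var counts = from win in source.TumblingWindow(TimeSpan.FromSeconds(5))
                             select win.Count();

                // Sink: print each window's count to the console.
                var sink = app.DefineObserver(() =>
                    Observer.Create<long>(c => Console.WriteLine("Events in window: {0}", c)));

                // Bind the query to the sink and run it until Enter is pressed.
                using (counts.Bind(sink).Run("HelloProcess"))
                {
                    Console.ReadLine();
                }
            }
        }
    }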
    Digging Deeper

    Here's a suggested path through the documentation to help you understand the next layer of StreamInsight technologies:

    Using Event Sources and Event Sinks - sources supply data and sinks consume it; this topic gives you an overview of how they work
    Publishing and Connecting to the StreamInsight Server - practical details on how to set up a StreamInsight server
    A Hitchhiker's Guide to StreamInsight 2.1 Queries - queries are the heart of how StreamInsight performs data analytics, and this whitepaper will help you really understand how they work
    Using StreamInsight LINQ - root through this section for technical details on specific query components
    Using the StreamInsight Event Flow Debugger - in addition to troubleshooting, the debugger is a great way to learn more about what goes on inside a StreamInsight application

    And Even Deeper

    Finally, to get a handle on some of the more complex things you can do with StreamInsight, dig into these:

    Input and Output Adapters - adapters can be useful for handling more complex sources and sinks
    Building Resilient StreamInsight Applications - a resilient application is able to recover from system failures
    Operations - this section will help you monitor and troubleshoot a running StreamInsight system

    The StreamInsight Community

    As you're designing and developing your StreamInsight solutions, you probably will find it helpful to see working examples or to learn tips and tricks from others. Or maybe you need a place to post a vexing question. Here are some community resources that we have found useful. If you know of others, please add them in the comments below.

    Code samples and tools:
    Official StreamInsight code samples
    Introduction to LinqPad Driver for StreamInsight 2.1 - LinqPad is a very useful tool for developing queries
    The following case studies are based on earlier versions of StreamInsight, but they still are useful examples:
    Microsoft Media Analytics - real-time monitoring and analytics
    Edgenet - responding to information from multiple sources
    ICONICS - managing energy usage

    Blogs:
    Microsoft StreamInsight
    Ruminations of J.net
    Richard Seroter's Architecture Musings
    pluralsight

    Forums:
    MSDN StreamInsight Forum
    stackoverflow

    Training:
    Microsoft StreamInsight Fundamentals ("Introducing StreamInsight" is free) from pluralsight

    Twitter:
    @streaminsight

    You're a StreamInsight Expert

    That should get you going. Please add any other resources you have found useful in the comments below.

    Regards,
    The StreamInsight Team

    Read the article

  • Inside Red Gate - Experimenting In Public

    - by Simon Cooper
    Over the next few weeks, we'll be performing experiments on SmartAssembly to confirm or refute various hypotheses we have about how people use the product, what is stopping them from using it to its full extent, and what we can change to make it more useful and easier to use. Some of these experiments can be done within the team, some within Red Gate, and some need to be done on external users.

    External testing

    Some external testing can be done by standard usability tests and surveys; however, there are some hypotheses that can only be tested by building a version of SmartAssembly with some things in the UI or implementation changed. We'll then be able to look at how the experimental build is used compared to the 'mainline' build, which forms our baseline or control group, and use this data to confirm or refute the relevant hypotheses. However, there are several issues we need to consider before running experiments using separate builds:

    Ideally, the user wouldn't know they're running an experimental SmartAssembly. We don't want users to use the experimental build like it's an experimental build, we want them to use it like it's the real mainline build. Only then will we get valid, useful, and informative data concerning our hypotheses.

    There's no point running the experiments if we can't find out what happens after the download. To confirm or refute some of our hypotheses, we need to find out how the tool is used once it is installed. Fortunately, we've applied feature usage reporting to the SmartAssembly codebase itself to provide us with that information. Of course, this then makes the experimental data conditional on the user agreeing to send that data back to us in the first place. Unfortunately, even though this does limit the amount of useful data we'll be getting back, and possibly skew the data, there's not much we can do about this; we don't collect feature usage data without the user's consent. Looks like we'll simply have to live with this.

    What if the user tries to buy the experiment? This is something that isn't really covered by the Lean Startup book; how do you support users who give you money for an experiment? If the experiment is a new feature, and the user buys a license for SmartAssembly based on that feature, then what do we do if we later decide to pivot & scrap that feature? We've either got to spend time and money bringing that feature up to production quality and into the mainline anyway, or we've got disgruntled customers. Either way is bad. Again, there's not really any good solution to this.

    Similarly, what if we've removed some features for an experiment and a potential new user downloads the experimental build? (As I said above, there's no indication the build is an experimental build, as we want to see what users really do with it.) The crucial feature they need is missing, causing a bad trial experience, a lost potential customer, and a lost chance to help the customer with their problem. Again, this is something not really covered by the Lean Startup book, and something that doesn't have a good solution.

    So, some tricky issues there, not all of them with nice easy answers. Turns out the practicalities of running Lean Startup experiments are more complicated than they first seem!

    Read the article

  • Free tools versus paid tools.

    - by Dennis Vroegop
    We live in a strange world. Information should be free. Tools should be free. Software should be free (and I mean free as in free beer, not as in free speech). Of course, since I make my living (and pay my mortgage) by writing software, I tend to disagree. Or rather: I want to get paid for the things I do in the daytime. Next to that, I also spend time on projects I feel are valuable for the community, which I do for free. The reason I can do that is because I get paid enough in the daytime to afford that time. It gives me a good feeling, I help others and it's fun to do. But the baseline is: I get paid to write software. I am sure this goes for a lot of other developers. We get paid for what we do during the daytime and spend our free time giving back.

    So why does everyone always make a fuss when a company suddenly starts to charge for software? To me, this seems like a very reasonable decision. Companies need money: they have staff to pay, buildings to rent, coffee to buy, etc. All of this doesn't come free, so it makes sense that they charge their customers for the things they produce.

    I know there's a very big open source market out there, where companies give away (parts of) their software and get revenue out of the services they provide. But this doesn't work if your product doesn't need services. If you build a great tool that is very easy to use, and you give it away for free, you won't get any money by selling services that no user of your tool really needs. So what do you do? You charge money for your tool. It's either that or stop developing the tool and turn to other, more profitable projects. Like it or not, that's simple economics at work. You have something other people want, so you charge them for it.

    This week it was announced that what I believe is the most used tool for .NET developers (besides Visual Studio, of course), namely Red Gate's .NET Reflector, will stop being a free tool. They will charge you $35 for the next version. Suddenly Twitter was on fire and everyone was mad about it. But why? The tool is downloaded by so many developers that it must be valuable to them. I know of no serious .NET developer who hasn't got it on his or her machine. So apparently the tool gives them something they need. So why do they expect it to be free?

    There are developers out there maintaining and extending the tool, building new and better versions of it. And the price? $35 doesn't seem much. If I think of the time the tool has saved me, the 35 dollars would be earned back in a day. If by spending this amount of money I can rely on great software that helps me do my job better and faster, I have no problem spending it. I know that there is a great team behind it (the Red Gate tools are a must-have when developing SQL systems, for instance), and I do believe they are within their rights to charge this.

    So.. there you have it. This is, of course, my opinion. You may think otherwise. Please let me know in the comments what you think!

    Tags van Technorati: redgate,reflector,opensource

    Read the article

  • How to develop RPG Damage Formulas?

    - by user127817
    I'm developing a classical 2D RPG (in a similar vein to Final Fantasy) and I was wondering if anyone had some advice on how to do damage formulas, or links to resources/examples. I'll explain my current setup. Hopefully I'm not overdoing it with this question, and I apologize if my question is too large/broad.

    My characters' stats are composed of the following:

    enum Stat
    {
        HP = 0,
        MP = 1,
        SP = 2,
        Strength = 3,
        Vitality = 4,
        Magic = 5,
        Spirit = 6,
        Skill = 7,
        Speed = 8,    // Speed/Agility are the same thing
        Agility = 8,
        Evasion = 9,
        MgEvasion = 10,
        Accuracy = 11,
        Luck = 12,
    };

    Vitality is basically defense to physical attacks, and Spirit is defense to magic attacks. All stats have fixed maximums (9999 for HP, 999 for MP/SP and 255 for the rest). With abilities, the maximums can be increased (99999 for HP, 9999 for MP/SP, 999 for the rest), with typical values at level 100, before/after abilities + equipment etc., of 8000/20,000 for HP, 800/2000 for SP/MP, and 180/350 for the other stats. Late-game enemy HP will typically be in the lower millions (with a super boss having the maximum of ~12 million).

    I was wondering how people actually develop proper damage formulas that scale correctly. For instance, based on this data, using the damage formulas for Final Fantasy X as a base looked very promising. A full reference is here: http://www.gamefaqs.com/ps2/197344-final-fantasy-x/faqs/31381, but as a quick example (Str = 127, 'Attack' command used, enemy Def = 34):

    1. Physical Damage Calculation:
    Step 1: [{(Stat^3 ÷ 32) + 32} x DmCon ÷ 16]
    Step 2: [{(127^3 ÷ 32) + 32} x 16 ÷ 16]
    Step 3: [{(2048383 ÷ 32) + 32} x 16 ÷ 16]
    Step 4: [{(64011) + 32} x 1]
    Step 5: [{(64043 x 1)}]
    Step 6: Base Damage = 64043
    Step 7: [{(Def - 280.4)^2} ÷ 110] + 16
    Step 8: [{(34 - 280.4)^2} ÷ 110] + 16
    Step 9: [(-246)^2 ÷ 110] + 16
    Step 10: [60516 ÷ 110] + 16
    Step 11: [550] + 16
    Step 12: DefNum = 566
    Step 13: [BaseDmg x DefNum ÷ 730]
    Step 14: [64043 x 566 ÷ 730]
    Step 15: [36248338 ÷ 730]
    Step 16: Base Damage 2 = 49655
    Step 17: Base Damage 2 x {730 - (Def x 51 - Def^2 ÷ 11) ÷ 10} ÷ 730
    Step 18: 49655 x {730 - (34 x 51 - 34^2 ÷ 11) ÷ 10} ÷ 730
    Step 19: 49655 x {730 - (1734 - 1156 ÷ 11) ÷ 10} ÷ 730
    Step 20: 49655 x {730 - (1734 - 105) ÷ 10} ÷ 730
    Step 21: 49655 x {730 - (1629) ÷ 10} ÷ 730
    Step 22: 49655 x {730 - 162} ÷ 730
    Step 23: 49655 x 568 ÷ 730
    Step 24: Final Damage = 38635

    I simply modified the dividers to include the attack rating of weapons and the armor rating of armor.

    Magic damage is calculated as follows (Mag = 255, Ultima is used, enemy MDef = 1):

    Step 1: [DmCon x ([Stat^2 ÷ 6] + DmCon) ÷ 4]
    Step 2: [70 x ([255^2 ÷ 6] + 70) ÷ 4]
    Step 3: [70 x ([65025 ÷ 6] + 70) ÷ 4]
    Step 4: [70 x (10837 + 70) ÷ 4]
    Step 5: [70 x (10907) ÷ 4]
    Step 6: Base Damage = 190872 [cut to 99999]
    Step 7: [{(MDef - 280.4)^2} ÷ 110] + 16
    Step 8: [{(1 - 280.4)^2} ÷ 110] + 16
    Step 9: [{(-279.4)^2} ÷ 110] + 16
    Step 10: [(78064) ÷ 110] + 16
    Step 11: [709] + 16
    Step 12: MDefNum = 725
    Step 13: [BaseDmg x MDefNum ÷ 730]
    Step 14: [99999 x 725 ÷ 730]
    Step 15: Base Damage 2 = 99314
    Step 16: Base Damage 2 x {730 - (MDef x 51 - MDef^2 ÷ 11) ÷ 10} ÷ 730
    Step 17: 99314 x {730 - (1 x 51 - 1^2 ÷ 11) ÷ 10} ÷ 730
    Step 18: 99314 x {730 - (51 - 1 ÷ 11) ÷ 10} ÷ 730
    Step 19: 99314 x {730 - (49) ÷ 10} ÷ 730
    Step 20: 99314 x 725 ÷ 730
    Step 21: Final Damage = 98633

    The problem is that the formulas completely fall apart once stats start going above 255. In particular, Defense values over 300 or so start generating really strange behavior. High Strength + Defense stats lead to massive negative values, for instance. While I might be able to modify the formulas to work correctly for my use case, it'd probably be easier just to use a completely new formula.

    How do people actually develop damage formulas? I was considering opening Excel and trying to build the formula that way (mapping attack stats vs. defense stats, for instance), but I was wondering if there's an easier way. While I can't convey the full game mechanics of my game here, might someone be able to suggest a good starting place for building a damage formula? Thanks
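    As an illustration of one way to explore this, here is a short, hedged C# sketch of the physical formula above, using 64-bit integers. The truncation points are inferred from the FAQ's worked numbers (they reproduce the 38635 result for Str 127 / Def 34 / DmCon 16), and the defense sweep in Main shows the curve misbehaving as Def climbs past ~300; none of this is code from the question itself.

    using System;

    class DamageSketch
    {
        // FFX-style physical damage, as quoted in the steps above.
        // long intermediates avoid the Int32 overflow that large stats
        // would otherwise cause in products like BaseDmg x DefNum.
        static long PhysicalDamage(long str, long def, long dmCon)
        {
            long baseDmg = ((str * str * str / 32) + 32) * dmCon / 16;   // Steps 1-6
            long defDiff = (long)(def - 280.4);                          // truncate, per Step 9
            long defNum = defDiff * defDiff / 110 + 16;                  // Steps 7-12
            long baseDmg2 = baseDmg * defNum / 730;                      // Steps 13-16
            long mitigation = 730 - (def * 51 - def * def / 11) / 10;    // Steps 17-22
            return baseDmg2 * mitigation / 730;                          // Steps 23-24
        }

        static void Main()
        {
            // Reproduces the worked example: expect 38635.
            Console.WriteLine(PhysicalDamage(127, 34, 16));

            // Sweep Def upward: past roughly Def 300, both defNum and the
            // mitigation term start growing again, so more defense begins
            // to mean MORE damage taken rather than less.
            for (long def = 0; def <= 600; def += 50)
                Console.WriteLine("Def {0}: {1}", def, PhysicalDamage(255, def, 16));
        }
    }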

    Read the article

  • Moving the Oracle User Experience Forward with the New Release 7 Simplified UI for Oracle Sales Cloud

    - by mvaughan
    By Kathy Miedema, Oracle Applications User Experience

    In September 2013, Release 7 for Oracle Cloud Applications became generally available for Oracle Sales Cloud and HCM Cloud. This significant release allowed the Oracle Applications User Experience (UX) team to finally talk freely about Simplified UI, a user experience project in the works since Oracle OpenWorld 2012. Simplified UI represents the direction that the Oracle user experience, for all of its enterprise applications, is heading. Oracle's Apps UX team began by building a Simplified UI for sales representatives. You can find that today in Release 7, and it was demoed extensively during OpenWorld 2013 in San Francisco.

    This screenshot shows how Opportunities appear in the new Simplified UI for Oracle Sales Cloud, a user interface built for sales reps.

    Analyst Rebecca Wettemann, vice president of Nucleus Research, saw Simplified UI at Oracle OpenWorld 2013 and talked about it with CRM Buyer in "Oracle Revs Its Cloud Engines for a Better Customer Experience." Wettemann said there are distinct themes to the latest release: "One is usability. Oracle Sales Cloud, for example, is designed to have zero training for onboarding sales reps, which it does," she explained. "It is quite impressive, actually -- the intuitive nature of the application and the design work they have done with this goal in mind." The software uses as few buttons and fields as possible, she pointed out. "The sales rep doesn't have to ask, 'what is the next step?' because she can see what it is."

    In fact, there are three themes driving the usability that Wettemann noted. They are simplicity, mobility, and extensibility, and we write more about them on the Usable Apps web site. These three themes embody the strategy for Oracle's cloud applications user experiences.

    Simplified UI for Oracle Sales Cloud

    In developing a Simplified UI for Oracle Sales Cloud, Oracle's UX team concentrated on the tasks that sales reps need to do most frequently and that are most important. "Knowing that the majority of their work lives are spent on the road and on the go, they need to be able to quickly get in and qualify and convert their leads, monitor and progress their opportunities, update their customer and contact information, and manage their schedule," Jeremy Ashley, Vice President of the Applications UX team, said.

    Ashley said the Apps UX team has a good reason for creating a Simplified UI that focuses on self-service. "Sales people spend the day selling stuff," he said. "The only reason they use software is because the company wants to track what they're doing." Traditional systems of tracking that information include filling in a spreadsheet of leads or sales. Oracle wants to automate this process for the salesperson, and enable that person to keep everyone who needs to know up-to-date easily and quickly. Simplified UI addresses that problem by providing light-touch input.

    "It has to be useful to the salesperson," Ashley said about the Sales Cloud user experience. Simplified UI can tell sales reps about key opportunities, or provide information about a contact in just a click or two. Customer information is accessible quickly and easily with Simplified UI for the Oracle Sales Cloud.

    Simplified UI for Sales Cloud can also be extended easily, Ashley said. Users usually just need to add various business fields or create and modify analytical reports. The way that Simplified UI is constructed allows extensibility to happen by hiding or showing a few necessary fields. The Settings user interface, starting in Release 7, allows for the simple configuration of the most important visual elements. "With Sales Cloud, we identified a need to make the application useful and very simple," Ashley said. Simplified UI meets that need.

    Where can you find out more?

    To find out more about the Simplified UI and Oracle's ongoing investment in applications user experience innovations, come to one of our sessions at a user group conference near you. Stay tuned to the Voice of User Experience (VoX) blog: the next post will be about Simplified UI and HCM Cloud.

    Read the article

  • What Counts For a DBA: Imagination

    - by drsql
    "Imagination…One little spark, of inspiration… is at the heart, of all creation." – From the song "One Little Spark", by the Sherman Brothers I have a confession to make. Despite my great enthusiasm for databases and programming, it occurs to me that every database system I've ever worked on has been, in terms of its inputs and outputs, downright dull. Most have been glorified e-spreadsheets, many replacing manual systems built on actual spreadsheets. I've created a lot of database-driven software whose main job was to "count stuff"; phone calls, web visitors, payments, donations, pieces of equipment and so on. Sometimes, instead of counting stuff, the database recorded values from other stuff, such as data from sensors or networking devices. Yee hah! So how do we, as DBAs, maintain high standards and high spirits when we realize that so much of our work would fail to raise the pulse of even the most easily excitable soul? The answer lies in our imagination. To understand what I mean by this, consider a role that, in terms of its output, offers an extreme counterpoint to that of the DBA: the Disney Imagineer. Their job is to design Disney's Theme Parks, of which I'm a huge fan. To me this has always seemed like a fascinating and exciting job. What must an Imagineer do, every day, to inspire the feats of creativity that are so clearly evident in those spectacular rides and shows? Here, if ever there was one, is a role where "dull moments" must be rare indeed, surely? I wanted to find out, and so parted with a considerable sum of money for my wife and I to have lunch with one; I reasoned that if I found one small way to apply their secrets to my own career, it would be money well spent. Early in the conversation with our Imagineer (Cindy Cote), the job did indeed sound magical. However, as talk turned to management meetings, budget-wrangling and insane deadlines, I came to the strange realization that, in fact, her job was a lot more like mine than I would ever have guessed. Much like databases, all those spectacular Disney rides bring with them a vast array of complex plumbing, lighting, safety features, and all manner of other "boring bits", kept well out of sight of the end user, but vital for creating the desired experience; and, of course, it is these "boring bits" that take up much of the Imagineer's time. Naturally, there is still a vital part of their job that is spent testing out new ideas, putting themselves in the place of a park visitor, from a 9-year-old boy to a 90-year-old grandmother, and trying to imagine what experiences they'd like to have. It is these small, but vital, sparks of imagination and creativity that have the biggest impact. The real feat of a successful Imagineer is clearly to never to lose sight of this fact, in among all the rote tasks. It is the same for a DBA. Not matter how seemingly dull is the task at hand, try to put yourself in the shoes of the end user, and imagine how your input will affect the experience he or she will have with the database you're building, and how that may affect the world beyond the bits stored in your database. Then, despite the inevitable rush to be "done", find time to go the extra mile and hone the design so that it delivers something as close to that imagined experience as you can get. OK, our output still can't and won't reach the same spectacular heights as the "Journey into The Imagination" ride at EPCOT Theme Park in Orlando, where I first heard "One Little Spark". 
However, our imaginative sparks and efforts can, and will, make a difference to the user who now feels slightly more at home with a database application, or to the manager holding a report presented with enough clarity to drive an interesting decision or two. They are small victories, but worth having, and appreciated, or at least that's how I imagine it.

    Read the article

  • Oracle Data Integration 12c: Perspectives of Industry Experts, Customers and Partners

    - by Irem Radzik
    As you may have seen from our recent blog posts on Oracle Data Integrator 12c and Oracle GoldenGate 12c, we are very excited to share with you the great new features the 12c release brings to Oracle’s data integration solutions. And, fortunately, we are not alone in this sentiment. Since the press announcement on October 17th, which incorporates our customers' and experts' testimonials, we have seen positive comments in leading technology publications and social media as well. Here are some examples:
    In CIO and PCWorld you can find Joab Jackson’s article, Oracle Data Integrator 12c ready for real-time analysis, where he wrote about the tight integration between Oracle Data Integrator and Oracle GoldenGate. He noted “Heeding the call from enterprise customers who clamor for more immediacy in their data-driven reports, Oracle has updated its data-integration software portfolio so that it can more rapidly deliver data to data warehouses and analysis applications.”
    Integration Developer News’ Vance McCarthy wrote the article Oracle Ships ‘Future Proofs’ Integration Tools for Traditional, Cloud, Big Data, Real-Time Projects and mentioned that “Oracle Data Integrator 12c and Oracle GoldenGate 12c sport a wide range of improvements to let devs more easily deliver data integration for cloud, analytics, big data and other new projects that leverage multiple datasets for business.“
    InformationWeek’s Doug Henschen gave a great overview of several key features, including the new flow-based UI in Oracle Data Integrator. Doug said “Oracle Data Integrator 12c introduces a complete makeover of the job-building experience, while real-time oriented GoldenGate 12c introduces performance gains.”
    Database Trends and Applications’ article Oracle Strengthens Data Integration with Release of Oracle Data Integrator 12c and Oracle GoldenGate 12c highlighted the productivity aspect of the new solution with these remarks: “tight integration between Oracle Data Integrator 12c and Oracle GoldenGate 12c enables developers to leverage Oracle GoldenGate’s low overhead, real-time change data capture completely within the Oracle Data Integrator Studio without additional training”.
    We are also thrilled about what our customers and partners have to say about our products and the new release. And we are equally excited to share those perspectives with you in our upcoming launch video webcast on November 12th. SolarWorld Industries America’s Senior Database Manager, Russ Toyama, will join our executives in our studio in Redwood Shores to discuss GoldenGate’s core benefits and the new release, while Surren Partharb, CTO of Strategic Technology Services for BT, and Mark Rittman, CTO of Rittman Mead, will provide their comments via interviews conducted in the UK. This interactive panel discussion in the video webcast will unveil the new release with the expertise of our development executives and great insight from our customers and partners. In addition, our product experts will be available online to answer chat questions. This is really a great opportunity to learn how Oracle's data integration offering has changed the integration and replication technology space with the new release, and established itself as the new leader. If you have not registered for this free event yet, you can do so via this link. We will run the live event at 8am PT/4pm GMT, followed by a replay of the event with live chat for Q&A at 10am PT/6pm GMT.
The replay will be available on-demand for those who register but cannot attend either session on November 12th.

    Read the article

  • Silverlight Cream for February 06, 2011 -- #1042

    - by Dave Campbell
    In this Issue: Mike Taulty, Timmy Kokke, Laurent Bugnion, Arik Poznanski, Deyan Ginev, Deborah Kurata(-2-), Johnny Tordgeman, Roy Dallal, Jaime Rodriguez, Samuel Jack(-2-), James Ashley.
    Above the Fold:
    Silverlight: "Customizing Silverlight properties for Visual Designers" Timmy Kokke
    WP7: "Back button press when using webbrowser control in WP7" Jaime Rodriguez
    Expression Blend: "Blend Bits 21–Importing from Photoshop & Illustrator…" Mike Taulty
    From SilverlightCream.com:
    Blend Bits 21–Importing from Photoshop & Illustrator… Mike Taulty is up to 21 episodes in his Blend Bits series now, and this one is about using Blend's import capability, such as importing a .psd file with all the layers intact.
    Customizing Silverlight properties for Visual Designers Timmy Kokke has part 1 of 2 on customizing your Silverlight control properties for design surfaces such as the Visual Studio designer or Expression Blend.
    An error when installing MVVM Light templates for VS10 Express Laurent Bugnion has released a new version of MVVM Light that resolves a problem with the VS2010 Express version of the templates... no problem with anything else.
    Reading RSS items on Windows Phone 7 Arik Poznanski has a post up about reading RSS on a WP7, but better yet, he also has code for a helper class that you can grab, plus an explanation of wiring it up.
    Integrating your Windows Phone unit tests with MSBuild #4: The WP7 Unit Test Application Deyan Ginev has a post up about Telerik's WP7 test app that outputs test results in XML from the emulator so they can be integrated with the MSBuild log.
    Accessing Data in a Silverlight Application: EF I apparently missed this post by Deborah Kurata last week on bringing data into your Silverlight app via Entity Framework... good detailed tutorial in VB and C#.
    Updating Data in a Silverlight Application: EF In Deborah Kurata's latest post, she is continuing with Entity Framework by demonstrating updates to the database... full source code will be provided in a later post.
    Fun with Silverlight and SharePoint 2010 Ribbon Control - Part 2 - An In Depth Look At The Ribbon Control Johnny Tordgeman has Part 2 of his Silverlight and SharePoint 2010 Ribbon series up... taking a deep dive into the ribbon... great explanation of the attributes, code included.
    Geographic Coordinates Systems Roy Dallal has some geo code up that's not necessarily Silverlight, but very cool if you're doing any GIS programming... ya gotta know the coordinate systems!
    Back button press when using webbrowser control in WP7 Jaime Rodriguez has a post up discussing the much-lamented back-button action in the certification requirements and how to deal with it in a web browser app.
    Multiplayer-enabling my Windows Phone 7 game: Day 1 Samuel Jack challenged himself to build a WP7 game in 3 days... now he's challenging himself to make it multiplayer in 3 days... this first hour-by-hour post covers research into networking and an Azure server-side solution.
    Multiplayer-enabling my Windows Phone 7 game: Day 2–Building a UI with XPF Day 2 of Samuel Jack getting the multiplayer portion of his game working in 3 days... this day involves getting up to speed with XPF.
    How to Hotwire your WP7 Phone Battery Did you realize if you run your WP7 battery completely down that you can't charge it? James Ashley reports that circumstance, and how he resolved it.
    Stay in the 'Light!

    Read the article

  • ArchBeat Link-o-Rama Top 10 for November 4-10, 2012

    - by Bob Rhubart
    The Top 10 most popular items shared via the OTN ArchBeat Facebook Page for the week of November 4-10, 2012.
    OAM/OVD JVM Tuning | @FusionSecExpert Vinay from the Oracle Fusion Middleware Architecture Group (the very prolific A-Team) shares a process for analyzing and improving performance in Oracle Virtual Directory and Oracle Access Manager.
    Exploring Lambda Expressions for the Java Language and the JVM | Java Magazine In the latest //Java/Architect column in Java Magazine, Ben Evans, Martijn Verburg, and Trisha Gee explain how, "although Lambda expressions might seem unfamiliar to begin with, they're quite easy to pick up, and mastering them will be vital for writing applications that can take full advantage of modern multicore CPUs."
    SOA Galore: New Books for Technical Eyes Only Shake up your technical skills with this trio of new technical books from community members covering SOA and BPM.
    Oracle Solaris 11.1 update focuses on database integration, cloud | Mark Fontecchio TechTarget editor Mark Fontecchio reports on the recent Oracle Solaris 11.1 release, with comments from IDC's Al Gillen.
    Solving Big Problems in Our 21st Century Information Society | Irving Wladawsky-Berger "I believe that the kind of extensive collaboration between the private sector, academia and government represented by the Internet revolution will be the way we will generally tackle big problems in the 21st century. Just as with the Internet, governments have a major role to play as the catalyst for many of the big projects that the private sector will then take forward and exploit. The need for high bandwidth, robust national broadband infrastructures is but one such example." — Irving Wladawsky-Berger
    ADF Mobile Custom JavaScript – iFrame Injection | John Brunswick The ADF Mobile Framework provides a range of out-of-the-box components to add within your AMX pages, according to John Brunswick. But what happens when "an out of the box component does not directly fulfill your development need? What options are available to extend your application interface?" John has an answer.
    Architects Matter: Making sense of the people who make sense of enterprise IT Why do architects matter? Oracle Enterprise Architect Eric Stephens suggests that you ask yourself this question the next time you take the elevator to the Oracle offices on the 45th floor of the Willis Tower in Chicago, Illinois (or any other skyscraper, for that matter). If you had to take the stairs to get to those offices, who would you blame? "You get the picture," he says. "Architecture is essential for any necessarily complex structure, be it a building or an enterprise." (Read the article...)
    Converting SSL certificate generated by a 3rd party to an Oracle Wallet | Paulo Albuquerque Oracle Fusion Middleware A-Team member Paulo Albuquerque shares "a workaround to get your private key, certificate and CA trusted certificates chain into Oracle Wallet."
    How Data and BPM are married to get the right information to the right people at the right time | Leon Smiers "Business Process Management…supports a large group of stakeholders within an organization, all with different needs," says Oracle ACE Leon Smiers. "End-to-end processes typically run across departments, stakeholders and applications, and can often have a long life-span. So how do organizations provide all stakeholders with the information they need?" Leon provides answers in this post.
    Updated Business Activity Monitoring (BAM) Class | Gary Barg Oracle SOA Team blogger Gary Barg has news for those interested in a skills upgrade. This updated Oracle University course "explains how to use Oracle BAM to monitor enterprise business activities across an enterprise in real time. You can measure your key performance indicators (KPIs), determine whether you are meeting service-level agreements (SLAs), and take corrective action in real time."
    Thought for the Day
    "For every complex problem, there is a solution that is simple, neat, and wrong." — H. L. Mencken (September 12, 1880 – January 29, 1956)
    Source: SoftwareQuotes.com

    Read the article

  • Interrupted Upgrade from 11.10 to 12.04

    - by Tamil
    My upgrade from 11.10 to 12.04 using the alternate ISO got interrupted and I had to hard-restart my machine. Now I feel that everything is recovered except my already-installed packages, like vim. How do I back up my home folder for a fresh installation of Ubuntu? Following are the errors I'm facing: I couldn't mark any package for re-installation in Synaptic, or remove and reinstall it either.
    Output of sudo apt-get install vim:
    Building dependency tree
    Reading state information... Done
    Package vim is not available, but is referred to by another package.
    This may mean that the package is missing, has been obsoleted, or is only available from another source
    E: Package 'vim' has no installation candidate
    If I try installing it from Synaptic I get:
    apache2.2-common: Package apache2.2-common has no available version, but exists in the database. This typically means that the package was mentioned in a dependency and never uploaded, has been obsoleted or is not available with the contents of sources.list
    My sources.list file:
    # added by the release upgrader
    # deb cdrom:[Ubuntu 12.04.1 LTS _Precise Pangolin_ - Release amd64 (20120822.4)]/ precise main restricted
    # added by the release upgrader
    #
    # deb cdrom:[Ubuntu 12.04.1 LTS _Precise Pangolin_ - Release amd64 (20120822.4)]/ precise main restricted
    # deb cdrom:[Ubuntu 11.04 _Natty Narwhal_ - Release amd64 (20110427.1)]/ natty main restricted
    # See http://help.ubuntu.com/community/UpgradeNotes for how to upgrade to
    # newer versions of the distribution.
    deb http://archive.ubuntu.com/ubuntu precise main restricted
    deb-src http://archive.ubuntu.com/ubuntu precise main restricted
    ## Major bug fix updates produced after the final release of the
    ## distribution.
    deb http://archive.ubuntu.com/ubuntu precise-updates main restricted
    deb-src http://archive.ubuntu.com/ubuntu precise-updates main restricted
    ## N.B. software from this repository is ENTIRELY UNSUPPORTED by the Ubuntu
    ## team. Also, please note that software in universe WILL NOT receive any
    ## review or updates from the Ubuntu security team.
    deb http://archive.ubuntu.com/ubuntu precise universe
    deb-src http://archive.ubuntu.com/ubuntu precise universe
    deb http://archive.ubuntu.com/ubuntu precise-updates universe
    deb-src http://archive.ubuntu.com/ubuntu precise-updates universe
    ## N.B. software from this repository is ENTIRELY UNSUPPORTED by the Ubuntu
    ## team, and may not be under a free licence. Please satisfy yourself as to
    ## your rights to use the software. Also, please note that software in
    ## multiverse WILL NOT receive any review or updates from the Ubuntu
    ## security team.
    deb http://archive.ubuntu.com/ubuntu precise multiverse
    deb-src http://archive.ubuntu.com/ubuntu precise multiverse
    deb http://archive.ubuntu.com/ubuntu precise-updates multiverse
    deb-src http://archive.ubuntu.com/ubuntu precise-updates multiverse
    ## Uncomment the following two lines to add software from the 'backports'
    ## repository.
    ## N.B. software from this repository may not have been tested as
    ## extensively as that contained in the main release, although it includes
    ## newer versions of some applications which may provide useful features.
    ## Also, please note that software in backports WILL NOT receive any review
    ## or updates from the Ubuntu security team.
    # deb http://us.archive.ubuntu.com/ubuntu/ natty-backports main restricted universe multiverse
    # deb-src http://us.archive.ubuntu.com/ubuntu/ natty-backports main restricted universe multiverse
    deb http://archive.ubuntu.com/ubuntu precise-security main restricted
    deb-src http://archive.ubuntu.com/ubuntu precise-security main restricted
    deb http://archive.ubuntu.com/ubuntu precise-security universe
    deb-src http://archive.ubuntu.com/ubuntu precise-security universe
    deb http://archive.ubuntu.com/ubuntu precise-security multiverse
    deb-src http://archive.ubuntu.com/ubuntu precise-security multiverse
    ## Uncomment the following two lines to add software from Canonical's
    ## 'partner' repository.
    ## This software is not part of Ubuntu, but is offered by Canonical and the
    ## respective vendors as a service to Ubuntu users.
    deb http://archive.canonical.com/ubuntu precise partner
    # deb-src http://archive.canonical.com/ubuntu natty partner
    ## This software is not part of Ubuntu, but is offered by third-party
    ## developers who want to ship their latest software.
    deb http://extras.ubuntu.com/ubuntu precise main
    deb-src http://extras.ubuntu.com/ubuntu precise main
    # deb http://tamil.3758_gmail.com:[email protected]/free unstable main # disabled on upgrade to oneiric
    # deb http://debian.datastax.com/natty oneiric main # disabled on upgrade to oneiric
    Output of sudo apt-get update:
    Err http://archive.ubuntu.com precise InRelease
    Err http://archive.canonical.com precise InRelease
    Err http://archive.ubuntu.com precise-updates InRelease
    Err http://archive.ubuntu.com precise-security InRelease
    Err http://extras.ubuntu.com precise InRelease
    Err http://archive.canonical.com precise Release.gpg
      Unable to connect to 172.16.140.249:3142:
    Err http://archive.ubuntu.com precise Release.gpg
      Unable to connect to 172.16.140.249:3142:
    Err http://archive.ubuntu.com precise-updates Release.gpg
      Unable to connect to 172.16.140.249:3142:
    Err http://extras.ubuntu.com precise Release.gpg
      Unable to connect to 172.16.140.249:3142:
    Err http://archive.ubuntu.com precise-security Release.gpg
      Unable to connect to 172.16.140.249:3142:
    W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/precise/InRelease
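    For the backup part of the question, here is a minimal sketch of one way to archive a home directory before reinstalling. This is not from the original post: the user name tamil and the destination under /media/backup are placeholder assumptions, and the cache directories skipped are a judgment call.

        #!/usr/bin/env python3
        """Minimal sketch: archive a home directory before a fresh install.

        Assumptions (placeholders): the user is 'tamil' and an external drive
        with enough free space is mounted at /media/backup.
        """
        import tarfile
        from pathlib import Path

        HOME = Path("/home/tamil")                      # placeholder user
        DEST = Path("/media/backup/home-tamil.tar.gz")  # placeholder destination

        # Cache directories that are safe to skip; they are regenerated anyway.
        SKIP = {".cache", ".thumbnails"}

        def keep(tarinfo):
            # Return None to exclude an entry from the archive.
            parts = Path(tarinfo.name).parts
            return None if any(p in SKIP for p in parts) else tarinfo

        with tarfile.open(DEST, "w:gz") as tar:
            # arcname keeps paths relative, so the archive restores cleanly anywhere.
            tar.add(HOME, arcname=HOME.name, filter=keep)
        print(f"Backed up {HOME} to {DEST}")

    The same result is achievable with tar or rsync directly; the point is only to get /home safely off the disk before reinstalling.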

    Read the article

  • Cutting-Edge Demos Coming to Collaborate12

    - by mvaughan
    By Kathy Miedema, Oracle Applications User Experience
    Are you building your Collaborate 2012 agenda? Leave room for a stop at the demo grounds while you’re in Las Vegas from April 22-26. In addition to several presentations on the Oracle user experience, the Applications User Experience (UX) team will be on the demo grounds with a new eye-tracking tool, as well as demos that showcase new user experience designs. Check out our cutting-edge technology, which we use to obtain feedback that helps improve the user experience of Oracle applications, and see what our next-generation designs are in the HCM and FIN user experiences.
    Photo by Martin Taylor – Oracle Applications User Experience
    An Apps UX team member demonstrates what happens during an eye-tracking test. The dots on the screen show where test participants were looking and how long they spent at each point on the page.
    The UX team will also be staffing an on-site lab at Collaborate. At on-site labs, conference participants can sign up to join customer feedback sessions on several different kinds of work flow designs, from HCM to FIN to CRM to mobile. The feedback UX team members collect helps inform and fine-tune the user experiences being designed for next-generation applications. At Collaborate12, for example, user experience designs around Help and organizational charts will be tested for usability. The Apps UX team brings on-site labs to many major user group conferences, including OpenWorld 2012 in October in San Francisco. Stay tuned to find out when our recruiters are ready to sign up participants, or leave a comment below to find out whether an on-site lab will be at your next conference.
    For information on the following presentations, which will be delivered by Apps UX team members, check the Usable Apps Events page.
    • The Fusion Applications User Experience: Transforming Work into Insight
    • Customizations Under the Covers – Making Fusion Applications Your Own
    • OAUG Fusion Middleware SIG (FMWSIG)
    • 18 Months with Fusion Applications – Stories From The Trenches
    • PeopleTools Tips and Techniques

    Read the article

  • Do MORE with WebCenter

    - by Michael Snow
    WEBCAST THURSDAY!! 03/22/12
    Do you need to lower costs? Raise productivity? Foster innovation? Improve online engagement? But you’re still stuck with Documentum? Step away from the ledge – there is hope – let us help you.
    Top 4 Content Imperatives
    • Lower Costs – Reduce labor, maintenance fees, storage and electrical consumption
    • Raise Productivity – Automation and integration, communication, findability
    • Foster Innovation – Enable collaboration, expertise location
    • Improve Online Engagement – Enable user-driven, dynamic marketing initiatives
    With the coming technology wave we see four content imperatives. Every organization has had to reduce costs; cost cutting has become a way of life. Everyone is working three jobs as positions are eliminated. And so we have to reduce labor, reduce maintenance, and reduce money we are wasting on things like storing content that is redundant or no longer useful.
    We also, to fill that gap, need to raise productivity. Knowledge workers represent the fastest growing segment of the workforce, accounting for 40%-75% of the employees at organizations in sectors like financial services, life sciences, healthcare and retail. What’s more, their wages total 18 percent of the United States GDP. And so we can’t afford information systems that don’t let our top performers be the best they can be. We look to automate the content processes, provide ways to integrate that content into our processes, provide communication to make decisions, and to make content more findable so people can make the right decision and move the process forward.
    And really, to get ourselves out of the current financial status, we can only cut costs so far. We have to innovate out of economic tough times – to find new products and new markets. And to enable the innovation process, we have to enable collaboration and expertise location. So much of innovation is about building on innovations that have come before. To solve problems, we have to be able to find what our organization has already created. We find that problems we need to solve have already been solved if we can find the right document, the right person. So we have to provide systems that enable us to stand on the shoulders of our organization’s accomplishments.
    Good content drives great marketing. Online engagement is growing as an absolute necessity for modern, growing marketing organizations that require that business users be enabled for dynamic marketing content creation, updates, and targeted content creation and management. Unfortunately – if you are currently stuck with Documentum, you are really lacking in your Web Experience Management capabilities. Documentum previously used FatWire for web publishing. Now FatWire is part of Oracle.
    Oracle provides powerful web engagement capabilities:
    • Increase sales and loyalty by optimizing online engagement
    • Create, manage and moderate contextually relevant, targeted and interactive online experiences
    • Optimize customer engagement across web, mobile and social channels
    • Manage large-scale multichannel global online presence with integration to enterprise applications
    • Enable business users to control their content and make their own updates
    • Publish content from native files – enable navigation of project documents, procedures, policy information
    • Enable content display and updates from existing web applications – one click to drag-and-drop content management functionality
    So you get the ability to self-publish information and make it navigable, to move the process of publishing from IT to business users, and the ability to address a whole new area of user engagement with web experience management. So… if you are still stuck with Documentum and don’t know what to do – contact us – not only will Oracle help you step away from the ledge, but with the MoveOff Documentum program we are also offering you a way out: trade in your Documentum licenses for a 100% credit on Oracle WebCenter. How’s that for a nice bonus? It’s time to stop maintaining Documentum, and to start innovating with Oracle WebCenter.
    Learn More Here!
    To learn more about what Oracle WebCenter can offer you today – join us for a webcast – your eyes will be opened to all that’s possible.
    Do More with WebCenter: Extend Beyond Content Management

    Read the article

  • Release 17 is here!

    - by Cheryl
    Our training development team has been busy updating courses to keep pace with the new release of CRM On Demand. Release 17 is here! And I heard recently that it's one of our biggest releases ever. A lot of new features and functionality for you to take advantage of - too much for me to cover in this blog post. But I thought I'd tell you about a few of my favorites - be sure to take a look at the What's New in Release 17 recording to see the full list, though... because I'm only going to touch on a few.
    Create your own look - okay, I'm starting with the fun stuff. But there is a new customizable themes feature so that you can change the look of the application: colors, logo, the shape of the tabs. And it's really easy. There's also a whole new library of ready-made themes for you to pick from if you just want to go with one of those. Use this new feature to match the look of your company logo and color scheme. Or blaze new trails. You can create the look for the whole company, or a different look for each CRM On Demand role. This might especially come in handy if you're using the Partner Relationship Management (PRM) capabilities of CRM On Demand - you can create themes for your partner-facing roles to provide branded partner portals.
    Speaking of PRM - there are enhancements in this release to help companies better manage their partner relationships. A new Deal Registration object, which is separate from the Opportunity record, and better Special Pricing Request and Marketing Development Fund Request processes give a lot more flexibility in how companies can build and manage their relationships with partners.
    Some new options for Forecasts in Release 17, too. You can now have more than one type of forecast generated each forecast period. For example, you might need to see a forecast of the total opportunity revenue for your sales team, as well as one that breaks down revenue by product. The forecast definition now lets you do that. Other options allow you to make submitting forecasts easier, split opportunity revenue across the team, and forecast that split appropriately. And - look for the new Forecast subject area in Answers, for building custom forecast reports.
    Ever wish you could use Workflow Rules to automatically reassign leads if they haven't been followed up on... or to email a manager if the status of a service request isn't changed after a specified period of time? Then check out the new Wait action for workflows. I think you'll be happy.
    Ok, enough for today. There is a lot to Release 17 that I didn't mention - a lot has been added for our Life Science industry edition, some new data visibility options, a new Data Loader tool, and more. Stay tuned for more blog posts about these and other Release 17 features in the coming weeks. In the meantime, don't forget about all of the resources we have for you to learn more (see my Learning About Release 17 blog post for details).

    Read the article

  • First-Time GLSL Shadow Mapping Problems

    - by Locke
    I'm working on building out a 2.5D engine and having massive problems getting my shadows working. I'm at a point where I'm VERY close. So, let's look at a picture of what I have: As you can see above, the image has lighting -- but the shadow map is displaying incorrectly. The shadow map is shown in the bottom-left of the screen as a normal 2D texture, so we can see what it looks like at any given time. If you notice, it appears that the shadows are generating backwards, in the wrong direction -- I think. But the problem runs a little deeper -- I'm just plotting the shadow onto the screen, which I know is wrong -- I'm ignoring the actual test to see if we NEED to show a shadow. The incoming parameters all appear to be correct -- so there has to be something wrong with my shader code somewhere. Here's what my code looks like:
    VERTEX:
    uniform mat4 LightModelViewProjectionMatrix;
    varying vec3 Normal;          // The eye-space normal of the current vertex.
    varying vec4 LightCoordinate; // The texture coordinate of the light of the current vertex.
    varying vec3 LightDirection;  // The eye-space direction of the light.
    void main()
    {
        Normal = normalize(gl_NormalMatrix * gl_Normal);
        LightDirection = normalize(gl_NormalMatrix * gl_LightSource[0].position.xyz);
        LightCoordinate = LightModelViewProjectionMatrix * gl_Vertex;
        LightCoordinate.xy = ( LightCoordinate.xy * 0.5 ) + 0.5;
        gl_Position = ftransform();
        gl_TexCoord[0] = gl_MultiTexCoord0;
    }
    FRAGMENT:
    uniform sampler2D DiffuseMap;
    uniform sampler2D ShadowMap;
    varying vec3 Normal;          // The eye-space normal of the current vertex.
    varying vec4 LightCoordinate; // The texture coordinate of the light of the current vertex.
    varying vec3 LightDirection;  // The eye-space direction of the light.
    void main()
    {
        vec4 Texel = texture2D(DiffuseMap, vec2(gl_TexCoord[0]));
        // Directional lighting
        // Build ambient lighting
        vec4 AmbientElement = gl_LightSource[0].ambient;
        // Build diffuse lighting
        float Lambert = max(dot(Normal, LightDirection), 0.0); //max(abs(dot(Normal, LightDirection)), 0.0);
        vec4 DiffuseElement = ( gl_LightSource[0].diffuse * Lambert );
        vec4 LightingColor = ( DiffuseElement + AmbientElement );
        LightingColor.r = min(LightingColor.r, 1.0);
        LightingColor.g = min(LightingColor.g, 1.0);
        LightingColor.b = min(LightingColor.b, 1.0);
        LightingColor.a = min(LightingColor.a, 1.0);
        LightingColor *= Texel;
        // Everything up to this point is PERFECT
        // Shadow mapping
        // ------------------------------
        vec4 ShadowCoordinate = LightCoordinate / LightCoordinate.w;
        float DistanceFromLight = texture2D( ShadowMap, ShadowCoordinate.st ).z;
        float DepthBias = 0.001;
        float ShadowFactor = 1.0;
        if( LightCoordinate.w > 0.0 )
        {
            ShadowFactor = DistanceFromLight < ( ShadowCoordinate.z + DepthBias ) ? 0.5 : 1.0;
        }
        LightingColor.rgb *= ShadowFactor;
        //gl_FragColor = LightingColor; //Yes, I know this is wrong, but the line above (gl_FragColor = LightingColor;) produces the wrong effect
        gl_FragColor = LightingColor * texture2D( ShadowMap, ShadowCoordinate.st );
    }
    I wanted to make sure the coordinates were correct for the shadow map -- so that's why you see it applied to the image as it is below. But the depth for each point seems to be wrong -- the shadows SHOULD be opposite (look at how the image is -- the shaded areas from normal lighting face the opposite direction from the shadows). Maybe my matrices are bad or something going in? They're isolated and appear to be correct -- nothing unusual is going in.
When I view from the light's view and get the MVP matrices for it, they're correct. EDIT: Added an image so you can see what happens when I use the correct assignment at the end of the GLSL: That's the image when the last line is just gl_FragColor = LightingColor; Maybe someone has some idea of what I screwed up?

    Read the article

  • Anatomy of a .NET Assembly - Signature encodings

    - by Simon Cooper
    If you've just joined this series, I highly recommend you read the previous posts in this series, starting here, or at least these posts, covering the CLR metadata tables. Before we look at custom attribute encoding, we first need to take a brief look at how signatures are encoded in an assembly in general.
    Signature types
    There are several types of signatures in an assembly, all of which share a common base representation, and all are stored as binary blobs in the #Blob heap, referenced by an offset from various metadata tables. The types of signatures are:
    • Method definition and method reference signatures
    • Field signatures
    • Property signatures
    • Method local variables. These are referenced from the StandAloneSig table, which is in turn referenced by method body headers.
    • Generic type specifications. These represent a particular instantiation of a generic type.
    • Generic method specifications. Similarly, these represent a particular instantiation of a generic method.
    All these signatures share the same underlying mechanism to represent a type.
    Representing a type
    All metadata signatures are based around the ELEMENT_TYPE structure. This assigns a number to each 'built-in' type in the framework; for example, UInt16 is 0x07, String is 0x0e, and Object is 0x1c. Byte codes are also used to indicate SzArrays, multi-dimensional arrays, custom types, and generic type and method variables. However, these require some further information.
    Firstly, custom types (i.e. not one of the built-in types) require you to specify the 4-byte TypeDefOrRef coded token after the CLASS (0x12) or VALUETYPE (0x11) element type. This 4-byte value is stored in a compressed format before being written out to disk (for the excruciating details, you can refer to the CLI specification). SzArrays simply have the array item type after the SZARRAY byte (0x1d). Multidimensional arrays follow the ARRAY element type with a series of compressed integers indicating the number of dimensions, and the size and lower bound of each dimension. Generic variables are simply followed by the index of the generic variable they refer to. There are other additions as well; for example, a specific byte value indicates a method parameter passed by reference (BYREF), and other values indicate custom modifiers.
    Some examples...
    To demonstrate, here are a few examples and what the resulting blobs in the #Blob heap will look like. Each name in capitals corresponds to a particular byte value in the ELEMENT_TYPE or CALLCONV structure, and coded tokens to custom types are represented by the type name in curly brackets.
    A simple field:
    int intField;
    FIELD I4
    A field of an array of a generic type parameter (assuming T is the first generic parameter of the containing type):
    T[] genArrayField
    FIELD SZARRAY VAR 0
    An instance method signature (note how the number of parameters does not include the return type):
    instance string MyMethod(MyType, int&, bool[][]);
    HASTHIS DEFAULT 3 STRING CLASS {MyType} BYREF I4 SZARRAY SZARRAY BOOLEAN
    A generic type instantiation:
    MyGenericType<MyType, MyStruct>
    GENERICINST CLASS {MyGenericType} 2 CLASS {MyType} VALUETYPE {MyStruct}
    For a more complicated example, in the following C# type declaration:
    GenericType<T> : GenericBaseType<object[], T, GenericType<T>> { ... }
    the Extends field of the TypeDef for GenericType will point to a TypeSpec with the following blob:
    GENERICINST CLASS {GenericBaseType} 3 SZARRAY OBJECT VAR 0 GENERICINST CLASS {GenericType} 1 VAR 0
    And a static generic method signature (generic parameters on types are referenced using VAR, generic parameters on methods using MVAR):
    TResult[] GenericMethod<TInput, TResult>(TInput, System.Converter<TInput, TResult>);
    GENERIC 2 2 SZARRAY MVAR 1 MVAR 0 GENERICINST CLASS {System.Converter} 2 MVAR 0 MVAR 1
    As you can see, complicated signatures are recursively built up out of quite simple building blocks to represent all the possible variations in a .NET assembly. Now that we've looked at the basics of normal method signatures, in my next post I'll look at custom attribute application signatures, and how they differ from normal signatures.
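    To make the byte-level encoding concrete, here is a minimal sketch in Python of the two building blocks described above: the compressed unsigned integer and the TypeDefOrRef coded token. The helper names and dictionaries are mine, not from the article; the byte values come from the ELEMENT_TYPE and calling-convention tables in ECMA-335.

        # Minimal sketch of ECMA-335 signature-blob assembly.
        ELEMENT_TYPE = {
            "BOOLEAN": 0x02, "I4": 0x08, "STRING": 0x0E, "BYREF": 0x10,
            "VALUETYPE": 0x11, "CLASS": 0x12, "VAR": 0x13, "GENERICINST": 0x15,
            "OBJECT": 0x1C, "SZARRAY": 0x1D, "MVAR": 0x1E,
        }
        CALLCONV = {"DEFAULT": 0x00, "FIELD": 0x06, "GENERIC": 0x10, "HASTHIS": 0x20}

        def compress_uint(value: int) -> bytes:
            """Compressed unsigned integer (ECMA-335 II.23.2)."""
            if value <= 0x7F:    # 1 byte:  0bbbbbbb
                return bytes([value])
            if value <= 0x3FFF:  # 2 bytes: 10bbbbbb bbbbbbbb
                return bytes([0x80 | (value >> 8), value & 0xFF])
            # 4 bytes: 110bbbbb followed by three more value bytes
            return bytes([0xC0 | (value >> 24), (value >> 16) & 0xFF,
                          (value >> 8) & 0xFF, value & 0xFF])

        def typedef_or_ref(tag: int, row: int) -> bytes:
            """TypeDefOrRef coded token: tag 0=TypeDef, 1=TypeRef, 2=TypeSpec."""
            return compress_uint((row << 2) | tag)

        # Field signature for: T[] genArrayField  ->  FIELD SZARRAY VAR 0
        blob = bytes([CALLCONV["FIELD"],
                      ELEMENT_TYPE["SZARRAY"],
                      ELEMENT_TYPE["VAR"]]) + compress_uint(0)
        print(blob.hex())  # -> 061d1300

    Running the sketch prints 061d1300, the four bytes that end up in the #Blob heap for the generic array field example. A custom type such as CLASS {MyType} would append typedef_or_ref() of the type's metadata row where the plain VAR index appears here.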

    Read the article

< Previous Page | 318 319 320 321 322 323 324 325 326 327 328 329  | Next Page >