Search Results

Search found 5254 results on 211 pages for 'concept analysis'.

Page 66/211 | < Previous Page | 62 63 64 65 66 67 68 69 70 71 72 73  | Next Page >

  • Excel 2010 & SSAS – Search Dimension Members

    - by Davide Mauri
    Today I’ve connected my Excel 2010 to an Analysis Services 2008 cube and I got a very nice (and unexpected) surprise! It’s now finally possible to search and filter Dimension Members directly from the combo box window. As you can easily imagine, for medium/big dimensions this is really – really – really useful!

    Read the article

  • Manchester UG Presentation Video

    In July I was invited to speak at the UK SQL Server UG event in Manchester. I spoke about Excel being a good data mining client. I was a little rushed at the end, as Chris Testa-O'Neill told me I had only 5 minutes to go when I had only been talking for 10 minutes. Apparently I have a reputation for running over my time allocation. At the event we also had a product demo from SQL Sentry around their BI monitoring dashboard solution. This includes SSIS, but the main thrust was SSAS. Then came Chris with a look at Analysis Services. If you have never heard Chris talk, then take the opportunity now; he is a top-class presenter and I am often found sat at the back of his classes. Here is the video link.

    Read the article

  • Microsoft’s 22tracks Music Service now Available in All Browsers

    - by Akemi Iwaya
    Are you tired of listening to the same old music and looking for something new to listen to? Then 22tracks from Microsoft is definitely worth a look! This online music service is available in your favorite browser, does not require an account to use, and lets you listen to music from multiple international sources! If you are curious about 22tracks, then the following excerpt and video sum up the service very nicely. From the blog post: The concept behind 22tracks is simple: 22 local top DJs from cities like Amsterdam, Brussels, London and Paris share their genre’s 22 hottest tracks of the moment. Each city boasts its own team of specialized DJs bringing you the newest tracks in their genre. When you get ready to select (or change to) another set of tracks, just click on the desired city at the top of the browser window, then click on the appropriate set from the drop-down list. 22tracks Homepage 22tracks and Internet Explorer team up to bring you a completely new online music experience [22tracks Blog] 22tracks about [YouTube] [via BetaNews and The Next Web]

    Read the article

  • Flash/Flex/Air and iOS

    - by David Archer
    I'm just a little confused by all of the news recently regarding the cancellation of mobile Flash, so I was hoping for a little help. I've had a search through and can't find the answers to these questions, so any help would be great. First up, I'm looking to create a game in Flash first, to test whether the concept works as a fun game (on Newgrounds/Kongregate/Facebook etc.). Would it be best to use Flash CS5.5, or Flash Builder? Secondly, with mobile Flash now being discontinued by Adobe, could I still port this game over to iOS through the Flash platform, or would it be better at that point to rewrite the whole game using Objective-C? (NOTE: I'm not an Objective-C developer, but am instead a JavaScript and ActionScript dev.) Any help would be great. Thanks!

    Read the article

  • Fast Fashion Freshness

    - by David Dorf
    Fashion retailers such as H&M, Zara, and Wet Seal have perfected the fast fashion retailing model. The concept requires no replenishment in order to maintain assortment freshness and to create a sense of urgency for the consumer to purchase now. However, maintaining assortment freshness results in high product turnover, making markdown optimization a necessity. Wet Seal, for instance, needed to move from ad-hoc markdowns and dealing with surplus inventory to handling markdowns methodically across 8,000 SKUs with only a 12-15 week lifecycle (from DC receipt to exit). By optimizing and automating markdowns, Wet Seal is reaching their goal of assortment freshness, which in turn increases sales. If you're interested in learning more, register for a free webinar occurring on May 13th featuring Daniel Ryu, Vice President of Planning and Allocation at Wet Seal. He'll be discussing how the fast fashion retailer maintains their goal of assortment freshness.

    Read the article

  • New Rules of Retail

    - by David Dorf
    I've been on vacation and preparing for Crosstalk, so it's been a while since I've posted. I've seen the agenda, and I can assure you Crosstalk will be lots of fun. In addition to hearing from lots of retailers, we'll also be doing a little bowling and racing on the track. I'll be around for the sessions, the ORUG meetings, and our Customer Advisory Board, so please be sure to say hello. I also just completed a white paper based on a previous blog posting, which in turn was based on learnings from reading What Would Google Do? For each of Jarvis' ten rules, I discuss the concept in the context of retail and provide real-world examples. No mention of products or sales pitches at all. You can download the paper here. It will put you in the right frame of mind for hearing Jeff Jarvis speak at Crosstalk. For those that can't make it, I'll post some highlights afterwards.

    Read the article

  • How to fix “Collection was modified; enumeration operation may not execute” when using Copy.asmx in SharePoint 2010

    - by ybbest
    Problem: In my current project, we use the Copy.asmx web service in SharePoint 2010 to upload a document to a document library. In one of the document libraries, when uploading a document with a choice field, it blows up and the error message is “Collection was modified; enumeration operation may not execute.” Analysis: After some research, we found that we have 2 content types that both have a field called Document Type; although the internal names are different, the display names are exactly the same, and this causes the problem. I am still not too sure why; it could possibly be a SharePoint bug. Solution: After renaming one of the fields' display names to a different name, it works like a charm. You can download source code with the problem here and source code without the problem here.
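
    As a quick way to spot this condition in a library, here is a hedged PowerShell sketch (the site URL and library name are placeholders) that groups a library's fields by display name and flags any display name shared by more than one field:

        # Run from the SharePoint 2010 Management Shell, or load the snap-in first.
        Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

        $web  = Get-SPWeb "http://yoursite"       # placeholder site URL
        $list = $web.Lists["Documents"]           # placeholder library name

        # Fields contributed by different content types can have distinct internal
        # names but identical display names; those show up here as groups of 2+.
        $list.Fields |
            Group-Object -Property Title |
            Where-Object { $_.Count -gt 1 } |
            ForEach-Object {
                "'{0}' is used by: {1}" -f $_.Name, (($_.Group | ForEach-Object { $_.InternalName }) -join ", ")
            }

        $web.Dispose()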

    Read the article

  • How to fix “Microsoft SharePoint is not supported with version 4.0.30319.225 of the Microsoft .Net Runtime” in PowerGUI

    - by ybbest
    Today, when I tried to run some PowerShell commands against SharePoint in PowerGUI, I encountered the error message below: Problem: Remove-SPSite : Microsoft SharePoint is not supported with version 4.0.30319.225 of the Microsoft .Net Runtime. At C:\SiteCreation.ps1:37 char:14 + CategoryInfo : InvalidData: (Microsoft.Share…mdletRemoveSite:SPCmdletRemoveSite) [Remove-SPSite], PlatformNotSupportedException Analysis: The error message is pretty clear: PowerGUI tries to run the PowerShell command under .NET 4.0, which is not supported by SharePoint 2010; SharePoint 2010 only supports .NET 3.5. So how can I change the settings so that PowerShell runs under .NET 3.5 in PowerGUI? The solution is pretty easy. Solution: 1. Open Windows Explorer, navigate to C:\Program Files (x86)\PowerGUI\ and open the configuration file ScriptEditor.exe.config. 2. Under the startup settings, remove the supportedRuntime entry with version=”v4.0″ (a scripted version of this edit is sketched below). 3. Restart PowerGUI and rerun your script. It works like a charm.
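
    Step 2 can also be scripted; a hedged sketch, assuming the default install path: back up the config, drop any supportedRuntime element whose version starts with v4, and save it back:

        # Close PowerGUI first, then back up and edit its script editor config.
        $config = "C:\Program Files (x86)\PowerGUI\ScriptEditor.exe.config"   # assumed default path
        Copy-Item $config "$config.bak"

        $xml = [xml](Get-Content $config)
        # Remove the v4.0 runtime entry so the editor falls back to the earlier runtime.
        $xml.configuration.startup.supportedRuntime |
            Where-Object { $_.version -like "v4*" } |
            ForEach-Object { [void]$_.ParentNode.RemoveChild($_) }
        $xml.Save($config)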

    Read the article

  • Unity 3D game idea for a fun and teaching game

    - by rasheeda
    I have been brainstorming for months now on writing a Unity 3D game for my final-year computer science project. I have been learning Unity for some time now, but coming up with a concept is more difficult than I thought. The game has to be really fun and also educational, one that the school and community can benefit from. I am thinking about a third-person game where the player runs around in an environment, picks up coins and earns points. This alone wouldn't earn me points, and I have been trying to find new ideas of my own and all over the net, but to no avail. Hopefully you guys can help me out with an idea, and how to make the lecturers appreciate the game and also make kids want to play it for a reason. Thanks.

    Read the article

  • SQL SERVER – SSMS: Memory Usage By Memory Optimized Objects Report

    - by Pinal Dave
    At conferences and at speaking engagements at the local UG, there is one question that keeps coming up which I wish were never asked. The question is, “Why is SQL Server using up all the memory and not releasing it even when idle?” Well, the answer can be long, and with the release of SQL Server 2014 it got even more complicated. SQL Server 2014 introduces In-Memory OLTP, which is a completely new concept, and our dependency on memory has increased manifold. In reality, nothing much changes, but we additionally have memory-optimized objects (tables and stored procedures) which reside completely in memory and improve performance. As a DBA, it is humanly impossible to get the hang of all the innovations and new features introduced in the next version. So today’s blog is about the report added to SSMS which gives a high-level view of this new feature. This report is available only from SQL Server 2014 onwards, because that is where the feature was introduced; earlier versions of SQL Server Management Studio do not show the report in the list. If we try to launch the report on a database which has no In-Memory filegroup defined, we see a message in the report. To demonstrate, I have created a fresh database called MemoryOptimizedDB with no special filegroup. Here is the query used to identify whether a database has a memory-optimized filegroup or not: SELECT TOP(1) 1 FROM sys.filegroups FG WHERE FG.[type] = 'FX' Once we add the filegroup using the command below, we see a different version of the report. USE [master] GO ALTER DATABASE [MemoryOptimizedDB] ADD FILEGROUP [IMO_FG] CONTAINS MEMORY_OPTIMIZED_DATA GO The report is still empty because we have not defined any memory-optimized table in the database. Total allocated size is shown as 0 MB. Now, let’s add the folder location into the filegroup and also create a few in-memory tables. We have used the nomenclature of IMO to denote “InMemory Optimized” objects. USE [master] GO ALTER DATABASE [MemoryOptimizedDB] ADD FILE ( NAME = N'MemoryOptimizedDB_IMO', FILENAME = N'E:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA\MemoryOptimizedDB_IMO') TO FILEGROUP [IMO_FG] GO You may have to change the path based on your SQL Server configuration. Below is the script to create the table. USE MemoryOptimizedDB GO --Drop table if it already exists. IF OBJECT_ID('dbo.SQLAuthority','U') IS NOT NULL DROP TABLE dbo.SQLAuthority GO CREATE TABLE dbo.SQLAuthority ( ID INT IDENTITY NOT NULL, Name CHAR(500)  COLLATE Latin1_General_100_BIN2 NOT NULL DEFAULT 'Pinal', CONSTRAINT PK_SQLAuthority_ID PRIMARY KEY NONCLUSTERED (ID), INDEX hash_index_sample_memoryoptimizedtable_c2 HASH (Name) WITH (BUCKET_COUNT = 131072) ) WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA) GO As soon as the above script is executed, both the table and the index are created. If we run the report again, we see something like below. Notice that table memory is zero but the index is using memory. This is because a hash index needs memory to manage the buckets it creates, so even if the table is empty, the index consumes memory. More about the internals of how In-Memory indexes and tables work will be reserved for future posts. Now, use the script below to populate the table with 10,000 rows: INSERT INTO SQLAuthority VALUES (DEFAULT) GO 10000 Here is the same report after inserting 10,000 rows into our in-memory table. There are three sections in the report: the total memory consumed by In-Memory objects; a pie chart showing memory distribution based on type of consumer – table, index and system; and details of memory usage by each table. The information about all three is taken from one single DMV, sys.dm_db_xtp_table_memory_stats. This DMV contains memory usage statistics for both user and system In-Memory tables. If we query the DMV and look at the data, we can easily notice that the system tables have negative object IDs. So, to look at user-table memory usage, below is an over-simplified version of the query. USE MemoryOptimizedDB GO SELECT OBJECT_NAME(OBJECT_ID), * FROM sys.dm_db_xtp_table_memory_stats WHERE OBJECT_ID > 0 GO This report helps the DBA identify which in-memory objects are taking a lot of memory, which can be used as a pointer for designing a solution. I am sure in future we will discuss the whole concept of In-Memory tables at length on this blog. To read more about In-Memory OLTP, have a look at the In-Memory OLTP series at Balmukund’s blog. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Server Management Studio, SQL Tips and Tricks, T SQL Tagged: SQL Memory, SQL Reports
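
    For anyone who prefers pulling those numbers from a script rather than the SSMS report, here is a hedged PowerShell sketch of the same DMV query (it assumes the SQLPS module for Invoke-Sqlcmd and a default local instance):

        # Query user-table memory usage from the In-Memory OLTP DMV.
        Import-Module SQLPS -DisableNameChecking

        $query = "SELECT OBJECT_NAME(object_id) AS TableName, " +
                 "memory_allocated_for_table_kb, memory_allocated_for_indexes_kb " +
                 "FROM sys.dm_db_xtp_table_memory_stats WHERE object_id > 0;"

        Invoke-Sqlcmd -ServerInstance "localhost" -Database "MemoryOptimizedDB" -Query $query |
            Format-Table -AutoSize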

    Read the article

  • Troubleshooting Application Timeouts in SQL Server

    - by Tara Kizer
    I recently received the following email from a blog reader: "We are having an OLTP database instance, using SQL Server 2005 with little to moderate traffic (10-20 requests/min). There are also bulk imports that occur at regular intervals in this DB and the import duration ranges between 10secs to 1 min, depending on the data size. Intermittently (2-3 times in a week), we face an issue, where queries get timed out (default of 30 secs set in application). On analyzing, we found two stored procedures, having queries with multiple table joins inside them of taking a long time (5-10 mins) in getting executed, when ideally the execution duration ranges between 5-10 secs. Execution plan of the same displayed Clustered Index Scan happening instead of Clustered Index Seek. All required Indexes are found to be present and Index fragmentation is also minimal as we Rebuild Indexes regularly alongwith Updating Statistics. With no other alternate options occuring to us, we restarted SQL server and thereafter the performance was back on track. But sometimes it was still giving timeout errors for some hits and so we also restarted IIS and that stopped the problem as of now." Rather than respond directly to the blog reader, I thought it would be more interesting to share my thoughts on this issue in a blog. There are a few things that I can think of that could cause abnormal timeouts: blocking, a bad plan in cache, outdated statistics, or a hardware bottleneck. To determine if blocking is the issue, we can easily run sp_who/sp_who2 or a query directly on sysprocesses (select * from master..sysprocesses where blocked <> 0).  If blocking is present and consistent, then you'll need to determine whether or not to kill the parent blocking process.  Killing a process will cause the transaction to roll back, so you need to proceed with caution.  Killing the parent blocking process is only a temporary solution, so you'll need to do more thorough analysis to figure out why the blocking was present.  You should look into missing indexes and perhaps consider changing the database's isolation level to READ_COMMITTED_SNAPSHOT. The blog reader mentions that the execution plan shows a clustered index scan when a clustered index seek is normal for the stored procedure.  A clustered index scan might have been chosen either because that is what is in cache already or because of out-of-date statistics.  The blog reader mentions that bulk imports occur at regular intervals, so outdated statistics is definitely something that could cause this issue.  The blog reader may need to update statistics after imports are done if the imports are changing a lot of data (greater than 10%).  If the statistics are good, then the query optimizer might have chosen to scan rather than seek in a previous execution because the scan was determined to be less costly due to the value of an input parameter.  If this parameter value is rare, then its execution plan in cache is what we call a bad plan.  You want the best plan in cache for the most frequent parameter values.  If a bad plan is a recurring problem on your system, then you should consider rewriting the stored procedure.  You might want to break up the code into multiple stored procedures so that each can have a different execution plan in cache. To remove a bad plan from cache, you can recompile the stored procedure.  An alternative method is to run DBCC FREEPROCCACHE, which drops the procedure cache.
    It is better to recompile stored procedures rather than dropping the procedure cache, as dropping the procedure cache affects all plans in cache rather than just the ones that were bad, so there will be a temporary performance penalty until the plans are loaded into cache again. To determine if there is a hardware bottleneck occurring, such as slow I/O or high CPU utilization, you will need to run Performance Monitor on the database server.  Hopefully you already have a baseline of the server so you know what is normal and what is not.  Be on the lookout for I/O requests taking longer than 12 milliseconds and CPU utilization over 90%.  The servers that I support are typically under 30% CPU utilization, but your baseline could be higher and be within a normal range. If restarting the SQL Server service fixes the problem, then the problem was most likely due to blocking or a bad plan in the procedure cache.  Rather than restarting the SQL Server service, which causes downtime, the blog reader should instead analyze the above-mentioned things.  Proceed with caution when restarting the SQL Server service, as all transactions that have not completed will be rolled back at startup.  This crash recovery process could take longer than normal if there was a long-running transaction running when the service was stopped.  Until the crash recovery process is completed on the database, it is unavailable to your applications. If restarting IIS fixes the problem, then the problem might not have been inside SQL Server.  Prior to taking this step, you should do analysis of the above-mentioned things. If you can think of other reasons why the blog reader is facing this issue a few times a week, I'd love to hear your thoughts via a blog comment.
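
    To make the recompile-versus-flush choice concrete, here is a hedged sketch (instance, database, and procedure names are placeholders; Invoke-Sqlcmd comes from the SQLPS module):

        Import-Module SQLPS -DisableNameChecking

        # Targeted: mark one suspect procedure so it gets a fresh plan on its next execution.
        Invoke-Sqlcmd -ServerInstance "MyServer" -Database "MyDatabase" -Query "EXEC sp_recompile N'dbo.MySuspectProc';"

        # Sledgehammer: wipe the entire plan cache. Every query recompiles on its next
        # run, so expect a temporary CPU hit across the whole instance.
        # Invoke-Sqlcmd -ServerInstance "MyServer" -Query "DBCC FREEPROCCACHE;"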

    Read the article

  • New Supply Chain, S&OP, & TPM Analyst Reports from Gartner, IDC Now Available

    - by Mike Liebson
    Check out these analyst reports Oracle has recently made available for customers and partners on Oracle.com: Gartner:  MarketScope for Stage 3 Sales and Operations Planning  -  Gartner lead supply chain planning analyst, Tim Payne, discusses the evolving definition of S&OP, the Gartner S&OP maturity model, and recommendations for selecting S&OP technology solutions. Gartner: Vendor Panorama for Trade Promotion Management in Consumer Goods  -  Consumer goods analyst, Dale Hagemeyer, presents an overview of the TPM market, followed by an analysis of vendor offerings. IDC:  Perspective: Oracle OpenWorld 2012 — Supply Chain as a Focus  -  Supply chain analyst, Simon Ellis, discusses supply chain highlights from the October OpenWorld conference. Value Chain Planning highlights include the VCP product roadmap and demand sensing presentations by Electronic Arts (Demantra) and Sony (Demand Signal Repository). For a complete set of analyst reports, visit here.

    Read the article

  • What are the strategies to become a good open source developer?

    - by u3050
    I always hear that getting involved with open source projects is good for your career, and that the more (good) open source you release, the closer you will be to getting your dream job before you've even had an interview. I am an expert in Java and I am trying to become fluent in Scala. I always think about getting involved in open source development in Java/Scala, but the following confusions are stopping me from doing so. How/where do I start with open source projects on GitHub etc.? How do I find active/busy open source projects? How do I find an area in such projects where improvement or enhancement is required? It looks too complex at first analysis, or it's pretty hard to find such opportunities. What are the common strategies to follow if I want to become a hobbyist/free-time open source developer? People who have experience in open source development, please share your learnings/expertise from scratch. Thanks in advance.

    Read the article

  • Interview with Java Champion Matjaz B. Juric on Cloud Computing, SOA, and Java EE 6

    - by [email protected]
    In a Java Champion interview Matjaz Juric of Slovenia, head of the Cloud Computing and SOA Competence Centre at the University of Maribor, and professor at the University of Ljubljana, shares insights about cloud computing, SOA and Java EE 6. Juric has worked on performance analysis and optimization of RMI-IIOP, as well as being a member of the BPEL Advisory Board, and a Java mentor and trainer.Regarding BPEL he remarks, "Probably the most important thing to understand is what should be programmed in Java and what should be programmed in BPEL. There is still some confusion. BPEL is for the process logic, while Java is for functionalities. Together, BPEL and Java form a strong alliance and enable faster development and maintenance of enterprise applications and their integrations. On the other hand, the integration between Java and BPEL could be improved. There have been different approaches, including Java snippets. I would like to see an XML data type in Java, without all the hassles with JAXB, mappings, or DOM." Read the rest of the article here.

    Read the article

  • Finding a problem in some task [closed]

    - by nagisa
    Recently I competed in the finals of a nationwide programming contest. Not unexpectedly, all problems were algorithmic. I lost (40 points out of 600. Winner got ~300). I know very well why I lost – I don't know how to find the actual problem in those obfuscated tasks which are the life-blood of every competition. I think that being self-taught and not well versed in algorithms got me too. As a side effect of learning things myself I know how to search for information; however, all I could find were a couple of questions about learning algorithms. For now I have put Python Algorithms: Mastering Basic Algorithms in the Python Language and Analysis of Algorithms, which I found in those questions, on my "to read" list. That leaves my first problem of not knowing how to find a problem unsolved. Will that ability come with learning algorithms? Or does it need some special attention? Any suggestions are welcome.

    Read the article

  • What’s Your Tax Strategy? Automate the Tax Transfer Pricing Process!

    - by tobyehatch
    Does your business operate in multiple countries? Well, whether you like it or not, many local and international tax authorities inspect your tax strategy.  Legal, effective tax planning is perceived as a “moral” issue. CEOs are being asked to testify on their process of tax transfer pricing between multinational legal entities.  Marc Seewald, Senior Director of Product Management for EPM Applications specializing in all tax subjects and Product Manager for Oracle Hyperion Tax Provisioning, and Bart Stoehr, Senior Director of Product Strategy for Oracle Hyperion Profitability and Cost Management, joined me for a discussion/podcast on this interesting subject.  So what exactly is “tax transfer pricing”? Marc defined it this way: “Tax transfer pricing is a profit allocation methodology required to be used by multinational corporations. Specifically, the ultimate goal of the transfer pricing is to ensure that the global multinational pays their fair share of income tax in each of their local markets. Specifically, it prevents companies from unfairly moving profit from ‘high tax’ countries to ‘low tax’ countries.” According to Marc, in today’s global economy, profitability can be significantly impacted by goods and services exchanged between the related divisions within a single multinational company.  To ensure that these cost allocations are done fairly, there are rules that govern the process. These rules ensure that intercompany allocations fairly represent the actual nature of the business’s activity – as if two divisions were unrelated – and provide a clear audit trail of how the costs have been allocated to prove that allocations fall within reasonable ranges.  What are the repercussions of improper tax transfer pricing? How important is it? Tax transfer pricing allocations can materially impact the amount of overall corporate income taxes paid by a company worldwide, in some cases by hundreds of millions of dollars!  Since so much tax revenue is at stake, revenue agencies like the IRS and international regulatory bodies like the Organization for Economic Cooperation and Development (OECD) are pushing to reform and clarify reporting for tax transfer pricing. Most recently, the OECD announced an “Action Plan for Base Erosion and Profit Shifting”. As Marc explained, the times are changing and companies need to be responsive to this issue. “It feels like every other week there is another company being accused of avoiding taxes,” said Marc. Most recently, Caterpillar was accused of avoiding billions of dollars in taxes. In the last couple of years, Apple, GE, Ikea, and Starbucks have all been accused of tax avoidance. It’s imperative that companies like these have a clear and auditable tax transfer process that enables them to justify tax transfer pricing allocations and avoid steep penalties and bad publicity. Transparency and efficiency are what is needed when it comes to the tax transfer pricing process. Bart explained that tax transfer pricing is driving a deeper inspection of profit recognition, specifically focused on the tax element of profit.  However, allocations needed to support tax profitability are nearly identical in process to allocations taking place in other parts of the finance organization. For example, the methods and processes necessary to arrive at tax profitability by legal entity are no different than those used to arrive at fully loaded profitability for a product line.
    In fact, there is a great opportunity for alignment across these two different functions. So it seems that tax transfer pricing should be reflected in profitability in general. Bart agreed and told us more about some of the critical sub-processes of an overall tax transfer pricing process within the Oracle solution for tax transfer pricing.  “First, there is a ton of data preparation, enrichment and pre-allocation data analysis that is managed in the Oracle Hyperion solution. This serves as the “data staging” to the next, critical sub-processes.  From here, we leverage the Oracle EPM platform’s ability to re-use dimensions and legal entity driver data and financial data with Oracle Hyperion Profitability and Cost Management (HPCM).  Within HPCM, we manage the driver data, define the legal entity to legal entity allocation rules (like cost plus), and have the option to test out multiple, simultaneous tax transfer pricing what-if scenarios.  Once processed, a tax expert can evaluate the effectiveness of any one scenario result versus another via a variance analysis configured with HPCM’s pre-packaged reporting capability known as Oracle Hyperion SmartView for Office.”  Further, Bart explained that the ability to visibly demonstrate how a cost or revenue has been allocated is really helpful and auditable.  “HPCM’s Traceability Maps are that visual representation of all allocation flows that have been executed and is the tax transfer analyst’s best friend in maintaining clear documentation for tax transfer pricing audits. Simply click and drill as you inspect the chain of allocation definitions and results. Once final, the post-allocated tax data can be compared to the GL to create invoices and journal entries for posting to your GL system of choice.  Of course, there is a framework for overall governance of the journal entries, allocation percentages, and reporting to include necessary approvals.” Lastly, Marc explained that the key value in using the Oracle Hyperion solution for tax transfer pricing is that it keeps everything in alignment in one single place. Specifically, Oracle Hyperion effectively becomes the single book of record for the GAAP, management, and the tax set of books. There are many benefits to having one source of the truth. These include EFFICIENCY, CONTROLS and TRANSPARENCY. So, what’s your tax strategy? Why not automate the tax transfer pricing process! To listen to the entire podcast, click here. To learn more about Oracle Hyperion Profitability and Cost Management (HPCM), click here.

    Read the article

  • Connect Microsoft Excel To SQL Azure Database

    A number of people have found my post about getting started with SQL Azure pretty useful. But it's all worthless if it doesn't add up to user value. Databases are like potential energy in physics – a promise that something could be put in motion. Users actually making decisions based on analysis is analogous to kinetic energy in physics: it's the fulfillment of the promise of potential energy. So what does this have to do with Office 2010? In Excel 2010 we made it truly easy to connect to a SQL Azure...
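
    As a hedged aside, the same SQL Azure database that Excel connects to can be reached from a script with an ordinary connection string; in this sketch the server, database, and credentials are placeholders:

        # Open a test connection to a SQL Azure database. Note the user@server login
        # form and the required encryption - SQL Azure refuses unencrypted sessions.
        $connStr = "Server=tcp:myserver.database.windows.net,1433;" +
                   "Database=mydb;User ID=myuser@myserver;Password=<password>;" +
                   "Encrypt=True;TrustServerCertificate=False;"

        $cn = New-Object System.Data.SqlClient.SqlConnection($connStr)
        $cn.Open()
        "Connected to $($cn.DataSource), database $($cn.Database)"
        $cn.Close()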

    Read the article

  • Please recommend citations for source code documentation standards

    - by Aerik
    I'm trying to convince another group in my company that they need to provide more documentation in their source code (they want to hand off the code to my group) but they're treating it as a "nice to have". In my view, it's a necessity. I've run a source code analysis tool and it's showing about 10% comment lines - but looking at the source code, most of that is coming from entire functions that the author has commented out. Can anyone provide some authoritative citations / references for documentation / comment standards for source code? (In case it matters, we're a C# house, with a little Matlab thrown in).

    Read the article

  • How to submit sitemap when your website has partial https? - Error: "Not in Domain"

    - by Ralph N
    My website is an ecommerce site that is set up to use http for the item browsing portion, but https for things like the shopping cart, contact us, etc. (anything that has forms on it). I submitted my website to Google Webmaster Tools a long time ago as http://www.mywebsite.com. I also submitted a sitemap with about 40 links - 8 of them are https. I've noticed that for the longest time, Google Webmaster Tools was reporting that 32 out of the 40 links have been crawled. I tested all the links against my robots.txt and realized that my robots.txt was blocking the https links. Google says those links are "Not In Domain". Is there a way I'm supposed to get around this so that I can have a hybrid-SSL site? I understand the concept that one site is mywebsite.com:80 and the other is mywebsite.com:443, but I'd like to avoid submitting and maintaining 2 separate websites in Google Webmaster Tools.
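
    A hedged way to see which sitemap entries fall outside the http property (assuming the sitemap lives at /sitemap.xml): download it and list the https URLs, since those belong to the mywebsite.com:443 site and would need their own submission:

        # Download the sitemap and list entries outside the http:// property.
        $client  = New-Object System.Net.WebClient
        $sitemap = [xml]$client.DownloadString("http://www.mywebsite.com/sitemap.xml")

        $sitemap.urlset.url | ForEach-Object { $_.loc } |
            Where-Object { $_ -like "https://*" } |
            ForEach-Object { "Belongs to the https property: $_" }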

    Read the article

  • Upcoming Webcast: “Supporting References” In Release 12 - SLA

    - by Oracle_EBS
    ADVISOR WEBCAST: “Supporting References” In Release 12 - SLA. PRODUCT FAMILY: Receivables, Payables, General Ledger. April 18, 2012 at 14:00 UK / 15:00 CET / 06:00 am Pacific / 7:00 am Mountain / 9:00 am Eastern. “Supporting References” enables users to enter additional information for the “Journal Entry Header” and “Journal Entry Lines” that can be used for analytical purposes. This functionality was earlier known as “Analytical Criteria”. In 11i, additional data was interfaced to GL on the journals using “Descriptive Flexfields”, whereas using this feature in R12, customers can create customized sources as ‘Supporting References’ and pass values into those sources from Subledgers like AP / AR / PA, etc. TOPICS WILL INCLUDE: Supporting References; business information about a subledger journal entry at the header or line level; establishing a subledger balance for a particular source value or combination of source values for a particular account; assistance with reconciliation of account balances; financial and managerial analysis. A short, live demonstration (only if applicable) and question and answer period will be included. Oracle Advisor Webcasts are dedicated to building your awareness around our products and services. This session does not replace offerings from Oracle Global Support Services. The current schedule can be found in Note 740966.1; post-presentation recordings can be found in Note 740964.1.

    Read the article

  • Google I/O 2010 - The joys of engineering leadership

    Google I/O 2010 - How to lose friends and alienate people: The joys of engineering leadership (Tech Talks). Brian W. Fitzpatrick, Ben Collins-Sussman. Are you considered the 'point' person for your team? Do you have sweaty palms, headaches, and a calendar full of meetings? You may have an affliction called 'manager'. This is treatable through careful analysis and therapy. We'll examine how you may have arrived at this state and how you can once again regain your self-respect and the respect of your peers. Hear real-life stories of both good and bad leadership. Learn to lead by following. For all I/O 2010 sessions, please go to code.google.com/events/io/2010/sessions.html (56:02)

    Read the article

  • JPA/EclipseLink multitenancy screencast

    - by alexismp
    I find JPA, and in particular EclipseLink 2.3, to be particularly well suited to illustrate the concept of multitenancy, one of the key PaaS features en route for Java EE 7. Here's a short (5-minute) screencast showing GlassFish 3.1.1 (due out real soon now) and its EclipseLink 2.3 JPA provider demonstrating multitenancy in action. In short, it adds EclipseLink annotations to a JPA entity and deploys two identical applications with different tenant-id properties defined in the persistence.xml descriptor. Each application only sees its own data, yet everything is stored in the same table, which was augmented with a discriminator column. For more advanced uses, such as the tenant property being set on the @PersistenceContext, XML configuration of multitenant JPA entities, and more, check out the nicely written wiki page.

    Read the article

  • SOA, Empowerment and Continuous Improvement

    - by Tanu Sood
    Rick Beers is Senior Director of Product Management for Oracle Fusion Middleware. Prior to joining Oracle, Rick held a variety of executive operational positions at Corning, Inc. and Bausch & Lomb. With a professional background that includes senior management positions in manufacturing, supply chain and information technology, Rick brings a unique set of experiences to cover the impact that technology can have on business models, processes and organizations. Rick will be hosting the IT Leader Editorial on a regular basis. I met my twin at Open World. We share backgrounds, experiences and even names. I hosted an invitation-only AppAdvantage Leadership Forum with an over-capacity 85 participants: 55 customers, 15 from the Oracle AppAdvantage team and 15 partners. It was a lively, open and positive discussion of pace-layered architectures and Oracle’s AppAdvantage approach to a unified view of Applications and Middleware. Rick Hassman from Pella was one of the customer panelists, and during the pre-event prep, Rick and I shared backgrounds and found that we had both been plant managers and led ERP deployments prior to leading IT itself. During the panel conversation I explored this with him, discussing the unique perspectives that this provides to CIOs. He then hit on a point that I wasn’t able to fully appreciate until a week later. First though, some background. The week after the Forum, one of the participants emailed me with the following thoughts: “I am 150% behind this concept……but we are struggling with the concept of web services and the potential use of the Oracle Service Bus technology let alone moving into using the full SOA/BPM/BAM software to extend our JD Edwards application to both integrate and support business processes”. After thinking a bit I responded this way: While I certainly appreciate the degree of change and effort involved, perhaps I could offer the following: One of the underlying principles behind Oracle AppAdvantage is that more often than not, the choice between changing a business process and invasively customizing ERP represents a Hobson's Choice: neither is acceptable. In this case the third option, moving the process out of ERP, is the only acceptable one. Providing this choice typically requires end-to-end, real-time interoperability across applications and/or services. This real-time interoperability, to be sustainable over time, requires a service-oriented architecture. There's just no way around this. SOA adoption is admittedly tough at the beginning. New skills, new technology and new headaches. But, like any radically new technology, it has a learning curve that drives cost down rather dramatically over time. Tough choices to be sure, but not entirely different than we face with every major technology cycle. Good points of course, but I felt that something was missing.
    The points were convincing, perhaps even a bit insightful, but they didn’t get at the heart of what Oracle AppAdvantage is focused upon: how the optimization of technology, applications, processes and relationships can change the very way that organizations operate. And then I thought back to the panel discussion with Rick Hassman at Oracle OpenWorld. Rick stressed that Continuous Improvement is a fundamental business strategy at Pella. I remember Continuous Improvement well, as I suspect does everyone who was in American manufacturing during the 80’s. Pioneered by W. Edwards Deming in Japan (and still known alternatively as Kaizen), Continuous Improvement sets in place the business culture that we must not become complacent with success and resistant to the ongoing need for change. Many believe that this single-handedly drove the renaissance in American manufacturing through the last two decades, which had become complacent during the 70’s and early 80’s. But what exactly does this have to do with SOA? It was Rick’s next point. He drew the connection that moving those business processes that need to continually change over time out of ERP and into edge applications and services enables continuous improvement by empowering people to continually strive for better ways of doing things rather than being bound by workflows that cannot change. A compelling connection: that SOA, and the overall Oracle AppAdvantage framework of which it is an integral part, can empower people towards continuous improvement in business processes and as a result drive business leadership and business excellence. What better case for technology innovation?

    Read the article

  • How to fix “Unable to cast COM object of type ‘Microsoft.SharePoint.Library.SPRequestInternalClass’ to interface type ‘Microsoft.SharePoint.Library.ISPRequest” using PowerGUI

    - by ybbest
    I got this error today when debugging some of my PowerShell scripts in PowerGUI. The script works perfectly fine in the PowerShell console. I then spent a couple of hours scratching my head trying to figure out why. It turns out that the PowerShell Variables panel causes the problem. I'm not quite sure why, but collapsing the panel fixes the problem. Problem: It throws the following exception when debugging my PowerShell script. Analysis: It turns out that the PowerShell Variables panel causes the problem. I assume it calls some function to grab the values of some variables, which causes the problem. Solution: Collapsing or closing the Variables panel fixes the problem.

    Read the article
