Search Results

Search found 22526 results on 902 pages for 'multiple databases'.


  • How does datomic handle "corrections"?

    - by blueberryfields
    tl;dr: Rich Hickey describes Datomic as a system which implicitly deals with the timestamps associated with data storage. From my experience, data is often imperfectly stored in systems, and on many occasions needs to be corrected retroactively (i.e., often the question "was a True on Tuesday at 12:00pm?" will have an incorrect answer stored in the database). This seems like a spot where the abstractions behind Datomic might break - do they? If they don't, how does the system handle such corrections? Rich Hickey, in several of his talks, justifies the creation of Datomic and explains its benefits. His work, if I understand correctly, is motivated by the core insight that humans, when speaking about data and facts, implicitly associate some of the related context into their statements (a date-time). By pushing the work required to manage the implicit date-time component of context into the database, he's created a system which is both much easier to understand and much easier to program. This turns out to be relevant to most database programmers in practice - his work saves everyone a lot of time managing complex, hard-to-produce/debug/fix time queries. However, especially in large databases, data is often damaged or incorrect (maybe it was not input correctly, maybe it eroded over time, etc.). While most database updates are insertions of new facts, and should indeed be treated that way, a non-trivial subset of the work required to manage time queries has to do with retroactive updates. I have yet to see any documentation which explains how such corrections, or retroactive updates, are handled by Datomic; from my experience, they are a non-trivial (and incredibly difficult to deal with) subset of the time-related data manipulation that database programmers are faced with. Does Datomic gracefully handle such updates? If so, how?

    Read the article

  • The five steps of business intelligence adoption: where are you?

    - by Red Gate Software BI Tools Team
    When I was in Orlando and New York last month, I spoke to a lot of business intelligence users. What they told me suggested a path of BI adoption. The user’s place on the path depends on the size and sophistication of their organisation. Step 1: A company with a database of customer transactions will often want to examine particular data, like revenue and unit sales over the last period for each product and territory. To do this, they probably use simple SQL queries or stored procedures to produce data on demand. Step 2: The results from step one are saved in an Excel document, so business users can analyse them with filters or pivot tables. Alternatively, SQL Server Reporting Services (SSRS) might be used to generate a report of the SQL query for display on an intranet page. Step 3: If these queries are run frequently, or business users want to explore data from multiple sources more freely, it may become necessary to create a new database structured for analysis rather than CRUD (create, retrieve, update, and delete). For example, data from more than one system — plus external information — may be incorporated into a data warehouse. This can become ‘one source of truth’ for the business’s operational activities. The warehouse will probably have a simple ‘star’ schema, with fact tables representing the measures to be analysed (e.g. unit sales, revenue) and dimension tables defining how this data is aggregated (e.g. by time, region or product). Reports can be generated from the warehouse with Excel, SSRS or other tools. Step 4: Not too long ago, Microsoft introduced an Excel plug-in, PowerPivot, which allows users to bring larger volumes of data into Excel documents and create links between multiple tables.  These BISM Tabular documents can be created by the database owners or other expert Excel users and viewed by anyone with Excel PowerPivot. Sometimes, business users may use PowerPivot to create reports directly from the primary database, bypassing the need for a data warehouse. This can introduce problems when there are misunderstandings of the database structure or no single ‘source of truth’ for key data. Step 5: Steps three or four are often enough to satisfy business intelligence needs, especially if users are sophisticated enough to work with the warehouse in Excel or SSRS. However, sometimes the relationships between data are too complex or the queries which aggregate across periods, regions etc are too slow. In these cases, it can be necessary to formalise how the data is analysed and pre-build some of the aggregations. To do this, a business intelligence professional will typically use SQL Server Analysis Services (SSAS) to create a multidimensional model — or “cube” — that more simply represents key measures and aggregates them across specified dimensions. Step five is where our tool, SSAS Compare, becomes useful, as it helps review and deploy changes from development to production. For us at Red Gate, the primary value of SSAS Compare is to establish a dialog with BI users, so we can develop a portfolio of products that support creation and deployment across a range of report and model types. For example, PowerPivot and the new BISM Tabular model create a potential customer base for tools that extend beyond BI professionals. We’re interested in learning where people are in this story, so we’ve created a six-question survey to find out. Whether you’re at step one or step five, we’d love to know how you use BI so we can decide how to build tools that solve your problems. 
    So if you have sixty seconds to spare, tell us on the survey!

    Read the article

  • .NET app - Should we use SQL Server and duplicate some reference data from an external Oracle DB? Or use Oracle and have a DB link?

    - by Daventry
    We're looking to migrate some existing Excel/Access processes into a new system which will provide the users with a Silverlight frontend to run and view the reports instead of using MS Access. The initial idea was to have SQL Server 2008 as the RDBMS. The problem is that we've got some static data such as country codes, counterparties, etc. which live in an existing Oracle DB. Since we do not want to duplicate that data (if possible), we were thinking of having a DB link between SQL Server and Oracle, but our firm does not allow that. So the options are either to duplicate the data or to use Oracle as the RDBMS - surprise, the firm does allow DB links between Oracle databases. The initial idea was also to use WCF RIA Services, Entity Framework, etc., which we're not sure play well with Oracle; that's why it was decided to go with SQL Server in the first place. Would you advise going for Oracle so that we can just link the static data? Or using SQL Server 2008 and replicating it because it's "safer" to stay within the Microsoft land? To use or not to use Entity Framework and WCF RIA Services at all? Regards. UPDATE: Thanks everyone for your answers. Nothing is set in stone yet. We'll try to import the data instead of linking, so that if the other DB goes down, our system can still carry on. We're likely to use SQL Server just because most developers are more experienced with it. Even if we used RIA Services, we can swap out the Data Access Layer and use other frameworks such as those mentioned below.

    Read the article

  • How to model the components of a non Information System?

    - by Adel C Kod
    So I am working on a project that's related to kernel code (specifically, the TCP/IP stack of the kernel). I need to build some models to describe the functionality and components of my system. Initially I thought about a Class Diagram; it can describe the general architecture of my system, but it doesn't make sense since my code is VERY structured (written in standard C). I also thought about DFDs: they'd describe the processes of my system and how the data is flowing. But they contain something which doesn't really fit in; data stores. I have no databases here (at all). For the functionality, other team members suggested using Activity and Sequence diagrams, which is kinda okay with me, but what about the system components? So basically my question is: I want to describe the components of my system; what do you suggest as a meaningful diagram to follow? (Again, the project is a research, low-level, systems-oriented project with almost no user interface at all)

    Read the article

  • Unified data source for k2 installed Joomla websites

    - by Özkan ÖZLÜ
    I am responsible for a few web sites of my organization. I use Joomla! 2.5.9 for those web sites. They all are running at the same server. I use the K2 component for content management. I have a general website which shows all the staff information on the 'Staff' page. Also, some of those people and their contents are shown in another department's website. So, there are databases for each web site. For example: in the general website (let's say general.org), when I click on the 'Staff' menu item, the page shows all of the people who work at my organization. They also work at different departments. In another web site (e.g. education.general.org), when I click on the 'Staff' menu item, it shows the people who work at the education department. But for each web site, I have different user accounts, which means a modification in one of them does not affect the other one. If one of the education staff tries to change his profile picture on the education web site, he also has to do it on the general web site. And sometimes one person might be working at two departments, so he has to edit his data three times. Is it possible to merge the records for all websites? In other words, I want everyone to insert/update their data on the general web site, and the other web sites will be updated automatically.

    Read the article

  • Squibbly: LibreOffice Integration Framework for the Java Desktop

    - by Geertjan
    Squibbly is a new framework for Java desktop applications that need to integrate with LibreOffice or, more generally, need office features as part of a Java desktop solution that could include, for example, JavaFX components. Here's what it looks like, right now, on Ubuntu 13.04: Why is the framework called Squibbly? Because I needed a unique-ish name, because "squibble" sounds a bit like "scribble" (which is what one does with text documents, etc.), and because of the many absurd definitions in the Urban Dictionary for the apparently real word "squibble", e.g., "A name for someone who is squibblish in nature." And, another e.g., "A squibble is a small squabble. A squabble is a little skirmish." But the real reason is the first definition (and definitely not the fourth definition): "Taking a small portion of another persons something, such as a small hit off of a pipe, a bite of food, a sip of a drink, or drag of a cigarette." In other words, I took (or "squibbled") a small portion of LibreOffice, i.e., OfficeBean, and integrated it into a NetBeans Platform application. Now anyone can add new features to it, to do anything they need, such as creating a legislative software system, as Propylon has done with their own solution on the NetBeans Platform. For me, the starting point was Chuk Munn Lee's similar solution from some years ago. However, he uses reflection a lot in that solution, because he didn't want to bundle the related JARs with the application. I understand that benefit, but I find it even more beneficial not to require the user to specify the location of the LibreOffice installation, since all the necessary JARs and native libraries (currently 32-bit Linux only, by the way) are bundled with the application. Plus, hundreds of lines of reflection code, as in Chuk's solution, are not fun to work with at all. Switching between applications is done like this: It's a work in progress, a proof of concept only. Just the result of a few hours of work to get the basic integration to work. Several problems remain, some of them potentially unsolvable, starting with these, but others will be added here as I identify them: Window management problems. I'd like to let the user have multiple LibreOffice applications and documents open at the same time, each in a new TopComponent. However, I haven't figured out how to do that. Right now, each application is opened into the same TopComponent, replacing the currently open application. I don't know the OfficeBean API well enough, e.g., should a single OfficeBean be shared among multiple TopComponents or should each of them have their own instance of it? Focus problems. When putting the application behind other applications and then switching back to the application, typing text becomes impossible. When closing a TopComponent and reopening it, the content is lost completely. Somehow the loss of focus, and then the return of focus, disables something. No idea how to fix that. The project is checked into this location, which isn't public yet, so you can't access it yet. Once it's publicly available, it would be great to get some code contributions and tweaks, etc. https://java.net/projects/squibbly Here's the source structure, showing especially how the OfficeBean JARs and native libraries (currently for Linux 32-bit only) fit in: Ultimately, it would be cool to integrate or share code with http://joeffice.com!

    Read the article

  • PASS Summit Preconference and Sessions

    - by Davide Mauri
    I’m very pleased to announce that I’ll be delivering a Pre-Conference at PASS Summit 2012. I’ll speak about Business Intelligence again (as I did in 2010), but this time I’ll focus only on the Data Warehouse, since it’s a big topic even on its own. I’ll discuss not only what a Data Warehouse is and how it can be modeled and built, but also how its development can be approached using an Agile approach, bringing the experience I have gathered in this field. Building the Agile Data Warehouse with SQL Server 2012 http://www.sqlpass.org/summit/2012/Sessions/SessionDetails.aspx?sid=2821 I’m sure you’ll like it, especially if you’re starting to create a BI Solution and you’re wondering what a Data Warehouse is, whether it is still useful nowadays that everyone talks about Self-Service BI and In-Memory databases, and what’s the correct path to follow in order to have a successful project up and running. Besides this Preconference, I’ll also deliver a regular session, this time related to database administration, monitoring and tuning: DMVs: Power in Your Hands http://www.sqlpass.org/summit/2012/Sessions/SessionDetails.aspx?sid=3204 Here we’ll dive into the most useful DMVs, so that you’ll see how they can help in everyday management in order to discover, understand and optimize your SQL Server installation, from the server itself to the single query. See you there!!!!!

    Read the article

  • Today's Links (6/27/2011)

    - by Bob Rhubart
    2011 Entrepreneurs of the Year, Northern California Region Drake Martinet reports on the new batch of entrepreneurs joining the ranks of Oracle CEO Larry Ellison, Yahoo CEO Carol Bartz and eBay co-founder Pierre Omidyar as the Northern California Region winners of Ernst & Young's Entrepreneurs of the Year awards. Technical Article: Caching Strategies for Oracle Service Bus 11g William Markito Oliveira illustrates how the right caching strategy can make a big difference in application performance. Kscope 11 - Day 1 and 2 Oracle ACE Director Markus Eisele checks in from Long Beach. Kaleidoscope 2011: Sunday’s Symposium And so does Oracle ACE Director Marco Gralike. Yet another GlassFish 3.1.1 promoted build | The Aquarium "This version was carefully designed to be highly compatible with the previous 3.x versions," says Alexis, "thus leaving you with little reasons not to upgrade as soon as it comes out this summer." Using NoSQL database in your Java EE 6 Applications on GlassFish - MongoDB for now! "The NoSQL databases are not intended to be a replacement for the mainstream RDBMS," says Arun Gupta. I have a performance problem | Alan Hargreaves Good (and entertaining) advice from an Australian Solaris and Network Domain TSC* Principal Field Technologist.

    Read the article

  • Oracle Executive Strategy Brief: Enterprise-Grade Cloud Applications

    - by B Shashikumar
    Cloud Computing has clearly evolved into one of the dominant secular trends in the industry. Organizations are looking to the cloud to change how they buy and consume IT. And it's no longer about just lower up-front costs. The cloud promises to deliver greater agility and free up resources to focus on innovation versus running and maintaining systems. But are organizations actually realizing these benefits? The full promise of cloud is not being realized by customers who entrust their business to multiple niche cloud providers. While almost 9 out of 10 companies expect more IT agility with cloud, only 47% are actually getting it (Source: 2011 State of Cloud Survey by Symantec). These niche cloud customers have also seen the promises of lower costs, efficiency gains, improved security, and compliance go unfulfilled. Having one cloud provider for customer relationship management (CRM) and another for human capital management (HCM), and then trying to glue these proprietary systems together while integrating to a back-office financial system can add to complexity and long-term costs. Completing a business process or generating an integrated report is cumbersome, and leverages incomplete data. Why can’t niche cloud providers deliver on the full promise of cloud? It’s simple: you still need to complete business processes. You still need reporting that enables you to take action using data from multiple systems. You still have to comply with SOX and other industry regulations. These requirements don’t go away just because you deploy in the cloud. Delivering lower up-front costs by enabling customers to buy software as a service (SaaS) is the easy part. To get real value that lasts longer than your quarterly report, it’s important to realize the benefits of cloud without compromising on functionality and while having the right level of control and flexibility. This is the true promise of cloud. Oracle’s cloud strategy centers around delivering the benefits of cloud—without compromise. We uniquely empower our customers with complete solutions and choice, from the richest functionality to integrated reporting and a great user experience. It’s all available in the cloud. And it works not just with other Oracle cloud applications, but with your existing Oracle and third-party systems as well. This helps protect your current investments and extend their value as you journey to the cloud. We’ve made the necessary investments not only in our applications but also in the underlying technology that makes it all run—from the platform down to the hardware and operating system. We make it all. And we’ve engineered it to work together and be highly optimized for our customers, in the cloud. With Oracle enterprise-grade cloud applications, you get the benefits of cloud plus more power, more choice, and more confidence. Read more about how you can realize the true advantage of Cloud with Oracle Enterprise-grade Cloud applications in the Oracle Executive Strategy Brief here. You can also attend an Oracle Cloud Conference event at a city near you. Register here.

    Read the article

  • how to architect this to make it unit testable

    - by SOfanatic
    I'm currently working on a project where I'm receiving an object via web service (WSDL). The overall process is the following: Receive object - add/delete/update parts (or all) of it - and return the object with the changes made. The thing is that sometimes these changes are complicated and there is some logic involved, other databases, other web services, etc., so to facilitate this I'm creating a custom object that mimics the original one but has some enhanced functionality to make some things easier. So I'm trying to have this process: Receive original object - convert/copy it to custom object - add/delete/update - convert/copy it back to original object - return original object. Example: public class Row { public List<Field> Fields { get; set; } public string RowId { get; set; } public Row() { this.Fields = new List<Field>(); } } public class Field { public string Number { get; set; } public string Value { get; set; } } So for example, one of the "actions" to perform on this would be to find all Fields in a Row that match a Value equal to something, and update them with some other value. I have a CustomRow class that represents the Row class; how can I make this class unit testable? Do I have to create an interface ICustomRow to mock it in the unit tests? If one of the actions is to sum all of the Values in the Fields that have a Number equal to 10, like this function, how can I design the custom class to facilitate unit tests? Sample function: public int Sum(string number) { return row.Fields.Where(x => x.Number.Equals(number)).Sum(x => int.Parse(x.Value)); } Am I approaching this the wrong way?
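    One way to make the wrapper unit testable is the interface-based approach the question mentions: put the behaviour you care about behind an interface, keep the conversion from the WSDL Row inside the implementation, and let tests either mock the interface or build a CustomRow from hand-made Rows. The sketch below is only an illustration; ICustomRow's member list, the ReplaceValue action and the assumption that Field.Value holds an integer are hypothetical, not from the original question.

      using System.Collections.Generic;
      using System.Linq;

      // Row and Field are the classes from the question above.
      // Hypothetical interface so callers (and unit tests) depend on behaviour, not on the WSDL type.
      public interface ICustomRow
      {
          string RowId { get; }
          int Sum(string fieldNumber);
          void ReplaceValue(string oldValue, string newValue);
          Row ToRow();   // convert back to the original web-service object
      }

      public class CustomRow : ICustomRow
      {
          private readonly Row row;

          public CustomRow(Row row) { this.row = row; }   // copy/convert from the original object

          public string RowId => row.RowId;

          // Sum the values of all fields whose Number matches (assumes Value holds an integer).
          public int Sum(string fieldNumber) =>
              row.Fields.Where(f => f.Number == fieldNumber)
                        .Sum(f => int.Parse(f.Value));

          // Example "action": update every field holding a given value.
          public void ReplaceValue(string oldValue, string newValue)
          {
              foreach (var field in row.Fields.Where(f => f.Value == oldValue))
                  field.Value = newValue;
          }

          public Row ToRow() => row;
      }

    A unit test can then construct a Row with known Fields, wrap it in a CustomRow and assert on Sum or ReplaceValue without touching the web service; anything that needs the other databases or services can be injected behind its own interface and mocked the same way.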

    Read the article

  • Webcast: DB Enterprise User Security Integration with Oracle Directory Services

    - by B Shashikumar
    The typical enterprise has a large number of DBA (Database administrator) accounts that are locally managed, which is often very costly, problematic and error-prone. Databases are a crucial component of your enterprise IT infrastructure, housing sensitive corporate data and database user accounts and privileges. To ensure the integrity of your enterprise's data, it's imperative to have a well-managed identity management system. This begins with centralized management of user accounts and access rights. Enterprise User Security (EUS), an Oracle Database Enterprise Edition feature, combined with Oracle Identity Management, gives you the ability to centrally manage database users and their authorizations in one central place. The cost of user provisioning and password resets is dramatically reduced. This technology is a must for new application development and should be considered for existing applications as well. Join Oracle Advisors for a live webcast on Jul 11 at 8am Pacific Time where Oracle experts will briefly introduce EUS, followed by a detailed discussion about the various directory options that are supported, including integration with Microsoft Active Directory. We'll conclude with how to avoid common pitfalls when deploying EUS with directory services. To register for this event, click here

    Read the article

  • Free Webinar: A faster, cheaper, better IT Department with Azure

    - by Herve Roggero
    Join me for a free Webinar on Wednesday October 17th at 1:30PM, Eastern Time. I will discuss the benefits of cloud computing with the Azure platform. There isn’t a company out there that would say “No” to reduced IT costs and unlimited scaling bandwidth. This webinar will focus on the specific benefits of the Microsoft Azure cloud platform and will convince you of the sound business rationale behind moving to the cloud. From Infrastructure as a Service (IaaS) to Platform as a Service (PaaS), Azure supports quick deployments, virtual machines, native SQL Databases and much more. Topics that will be discussed: - Why use Azure for your Cloud Computing needs - IaaS and PaaS Offerings - Differing project approaches to Cloud computing - How Azure’s agility and reduced costs lead to better solutions Attendees of this webinar will also be eligible to receive the following: Free Two Hour Consultation which can include: - Review of Your Cloud Strategy - Cloud Roadmap Review - Review of Data-mart strategies - Review of Mobility Strategies Click Here to Register Now. About Herve Roggero Hervé Roggero, Azure MVP, is the founder of Blue Syntax Consulting, a company specialized in cloud computing products and services. Hervé's experience includes software development, architecture, database administration and senior management with both global corporations and startup companies. Hervé holds multiple certifications, including an MCDBA, MCSE, MCSD. He also holds a Master's degree in Business Administration from Indiana University. Hervé is the co-author of "PRO SQL Azure" from Apress. For more information, visit www.bluesyntax.net.

    Read the article

  • Applying Service Pack 1 to Team Foundation Server 2010

    - by Enrique Lima
    Disclosure: I performed the following activities on my Windows 7 SP1 system, with Visual Studio 2010 SP1 and a local Basic installation of TFS 2010. As with any deployment of a service pack into a server environment, take your recommended precautions and be aware of the changes you are putting in. With that said, make sure you back up your databases, and that you have an exit/rollback strategy in the event of an unexpected situation. Team Foundation Server 2010 Service Pack 1 corresponds to KB2182621. The KB article is http://support.microsoft.com/kb/2182621 The process is very simple to follow: you will need to execute the mu_team_foundation_server_2010_sp1_x86_x64_651711.exe file. That will extract the needed files and launch the wizard-driven installation. Once this process completes, you need to validate the changes by looking at the Team Foundation Server 2010 Administration Console, where you should see the reference to the KB number and SP1. There is also a good reason to validate log locations and records, either from the Team Foundation Server 2010 Administration Console or from Windows Explorer: go to the C:\ProgramData\Microsoft\Team Foundation\Server Configuration\Logs location and review the logs referenced by the servicing references.

    Read the article

  • SPARC SuperCluster: new Software Enhancements announced on December 4

    - by Giuseppe Facchetti
    December 4, 2012: Oracle Unveils Cloud and Consolidation Capabilities for Oracle SPARC SuperCluster. The latest SPARC SuperCluster update offers layered, zero-overhead virtualization for Mission-Critical applications. Oracle today announced new software enhancements to the Oracle SPARC SuperCluster engineered system which enable customers to consolidate any combination of mission-critical enterprise databases, middleware and applications on a single system and rapidly deploy secure, self-service cloud services. For all the details, click here.

    Read the article

  • Ask the Readers: How Many Monitors Do You Use with Your Computer?

    - by Asian Angel
    Most people have a single monitor for their computers, many have two, and some individuals enjoy “3 monitor plus” goodness. This week we would like to know how many monitors you use with your computer. Photo by DamnedNice. A good majority of people have a single monitor that they use with their computers and that single monitor serves their needs very well. It could be that these individuals do not engage in a heavy amount of work or play on their computers…they just need to do the basics like checking e-mail, using I.M., working with photos, etc. Another possibility is the use of virtual desktop software such as Dexpot, Yodm 3D, or Sysinternals Desktops on Windows systems. Linux systems such as Ubuntu already have that wonderful multi-desktop functionality built in. The wonderful part about virtual desktops is that a single monitor can feel equivalent to a small army of monitors. The ability to separate your open windows into “categories” and spread them out across multiple desktops is definitely nice. With each passing year dual monitor setups are becoming more common. Having twice the screen real-estate visible at the same time can be extremely convenient when you are multi-tasking. Perhaps you like to monitor your system’s stats and an e-mail account on the second monitor while working with software on the first. It certainly beats having windows popping up and down on your screen constantly while keeping on top of everything! Next we have the people who have three or more monitors in use with their computers. This may be a result of the type of work they do, an experiment to see if multiple monitors are right for them, or the cool, geeky factor that comes with having all those monitors. Needless to say these individuals can induce a good amount of envy and/or inspiration in the rest of us when we see their awesome setups. Are you perfectly content with a single monitor? Do you have two or more monitors that you use? If you have two or more monitors are they actually that useful to you? Perhaps you are getting ready even now to add additional monitors to your system. Whatever your situation may be at the moment, let us know your thoughts (and possible multi-monitor plans) in the comments!

    Read the article

  • Adventures in Lab Management Configuration: CMMI Edition Part 1 of 3

    - by Enrique Lima
    I remember at one point someone telling me how close Migrate was to Migraine. This was a process that involved an environment migrated from TFS 2008 to TFS 2010, where the process template needed to be migrated too. Here we are talking about CMMI v4.2 to CMMI v5.0. Now, the process to migrate the TFS infrastructure is one thing; migrating the Process Template is a different deal, not hard … just involved. I followed a combination of steps that came from a blog post as the main guidance, and then MSDN (as suggested on the guidance post) to complement some tasks and steps. Again, the focus I have here is CMMI. The high-level steps taken to bring the TFS 2008 CMMI v4.2 process template up to TFS 2010 are: 1) Backed up the Collection, Configuration and Warehouse databases. 2) Downloaded the Process Template using Visual Studio 2010. 3) Exported, modified and imported the Bug Type Definition. 4) Exported, modified and imported the Scenario or Requirement Type Definition. 5) Created and imported bug field mappings. Now we can attempt to connect using Test Manager, and you should be able to get this going. After that was done, it was time to enroll VMs that already existed in the environment. This was a bit more challenging, but in the end it was a matter of just analyzing the changes that had been made to have a temporary workaround from the time we migrated to the time we converted the Work Items and such, and added fields to enable communication between the project and the Test and Lab Manager component. There are 2 more parts to this post: the second will describe the detailed steps taken to complete the Process Template update, and the third will talk about the gotchas and fixes for the Lab Management portion.

    Read the article

  • What tools and knowledge do I need to create an application which generates bespoke automated e-mails? [on hold]

    - by Seraphina
    I'd like some suggestions as to how to best go about creating an application which can generate bespoke automated e-mails - i.e. send a personalized reply to a particular individual, interpreting the context of the message as intelligently as possible... (This is perhaps too big a question to be under one title?) What would be a good starting point? What concepts do I need to know? I'd imagine that the program needs to be able to trawl through e-mails as and when they come in, and search for keywords in the e-mail content, in order to write an appropriate reply. So there needs to be some form of automated response embedded in the code. Machine learning and databases come to mind here, as I'm aware that Google incorporates machine learning already in Gmail etc. It is quite tricky to google the above topic and find the perfect tutorial. But there are some interesting articles and papers out there: Machine Learning in Automated Text Categorization (2002) by Fabrizio Sebastiani, Consiglio Nazionale Delle Ricerche. However, this is not exactly a quick start guide. I intend to add to this question, and no doubt other questions will spark off this one. I look forward to suggestions.
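    Before reaching for machine learning, the keyword-matching idea described above can be prototyped in a few lines. The sketch below is only an illustration of the concept; the ReplyRule list, the GenerateReply method and the sample keywords are all made up for this example and say nothing about how a production system (or Gmail) actually does it.

      using System.Collections.Generic;
      using System.Linq;

      // One reply rule: if any keyword appears in the incoming message, use this template.
      public record ReplyRule(string[] Keywords, string Template);

      public static class AutoReplier
      {
          private static readonly List<ReplyRule> Rules = new()
          {
              new ReplyRule(new[] { "invoice", "payment" },
                            "Hello {0}, thanks for your message about billing. We will get back to you within one business day."),
              new ReplyRule(new[] { "password", "login" },
                            "Hello {0}, it sounds like you are having trouble signing in. Please try the reset link on the login page."),
          };

          private const string Fallback =
              "Hello {0}, thank you for contacting us. A member of the team will reply shortly.";

          // Pick the first rule whose keywords appear in the body; fall back to a generic reply.
          public static string GenerateReply(string senderName, string body)
          {
              var lower = body.ToLowerInvariant();
              var rule = Rules.FirstOrDefault(r => r.Keywords.Any(k => lower.Contains(k)));
              return string.Format(rule?.Template ?? Fallback, senderName);
          }
      }

    A real application would swap the hard-coded rules for a trained text classifier (the Sebastiani survey cited above is about exactly that) and call GenerateReply from whatever mailbox-polling or push mechanism the mail provider exposes.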

    Read the article

  • Hostmonster can't change domains around?

    - by loneboat
    (question imported from http://superuser.com/q/204439/53847 ) Horrible title, but I couldn't think of a succinct way to summarize it to fit. I have HostMonster for my web hosting. I have several domain names under the same account (using the same web space, IP address, etc...). Every HM account has one domain set up as the "main domain", and all other domains are "secondary". The only way I have ever encountered this being an issue is in trying to use HTTPS - since (from my limited understanding) HTTPS encrypts headers, you can't route HTTPS requests to different virtual hosts on a server - only unencrypted requests, since it must look in the request to know where to route it. When I registered for my account, I only had one domain name (A). I have since added domain names (B), (C), (D), etc... At one point I switched domain name (B) to be my "main" domain name - so I could use HTTPS with it. I have since sold domain name (B), and would like to make domain name (A) my "main" one again (as it was before), but HM support says, "no, once a domain name has been a 'main' domain name on an account once, we can't set it up to be a 'main' domain name again. You're welcome to use domains (C), or (D), though.". They tell me the only way to reuse domain (A) as a "main" domain would be to set up a new account and transfer over all my files. I'm confused here. If I have domains (D), (E), and (F), they say I'm welcome to make one of them my new main domain name, just never (A) again, since I've already "used" it once. Calls to support only reveal that they can't let me do it because doing so would somehow "break" my account. Can anyone think of any good reason why this should be so? The only thing I can think is that maybe they're using the domain names as keys in some database or something? But if that's the case, that's ridiculous - they need to reorganize their databases!

    Read the article

  • Should I be learning Linq, Direct SQL Commands (in .net), EF or other?

    - by Wil
    Basically, I have a very good knowledge of plain old SQL coming from Classic ASP programming. Over the past couple of months, I have been learning C# and today was my first full day at MVC 3 (Razor), which I am loving! I need to get back into databases and I know that writing SqlCommand everywhere is obviously outdated (although it is nice that I can still do it!). I used to go to a great usergroup as an IT Pro and the developer stuff went completely over my head; however, I do remember a few things which kept coming up, such as LINQ... However, that was some time ago and now the same people on Twitter are saying how outdated it is. I have tried to do research on both and I am clueless as to what direction I should go in, or when to use one over another (if learning both is a good thing). I am even more confused as I thought EF was a part of the .NET Framework; however, reading through the quick start guide, I had to download a component using NuGet. ... Basically I am out of my depth here and just need some honest advice on where to go!
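    To make the comparison concrete, here is a rough sketch of the same query written with a raw SqlCommand and with Entity Framework plus LINQ. The Customer class, ShopContext and the table/column names are invented for illustration; this is not a recommendation of one over the other, just what the two styles look like side by side.

      using System.Collections.Generic;
      using System.Data.Entity;        // Entity Framework 6 (NuGet package)
      using System.Data.SqlClient;
      using System.Linq;

      public class Customer
      {
          public int Id { get; set; }
          public string Name { get; set; }
          public string City { get; set; }
      }

      // Hypothetical EF Code First context: one class per table, no hand-written SQL.
      public class ShopContext : DbContext
      {
          public DbSet<Customer> Customers { get; set; }
      }

      public static class CustomerQueries
      {
          // Classic ADO.NET: you write the SQL yourself and map rows by hand.
          public static List<string> GetLondonCustomersRaw(string connectionString)
          {
              var names = new List<string>();
              using (var conn = new SqlConnection(connectionString))
              using (var cmd = new SqlCommand(
                  "SELECT Name FROM Customers WHERE City = @city", conn))
              {
                  cmd.Parameters.AddWithValue("@city", "London");
                  conn.Open();
                  using (var reader = cmd.ExecuteReader())
                      while (reader.Read())
                          names.Add(reader.GetString(0));
              }
              return names;
          }

          // EF + LINQ: the query is composed in C# and translated to SQL by the framework.
          public static List<string> GetLondonCustomersEf(ShopContext db) =>
              db.Customers
                .Where(c => c.City == "London")
                .Select(c => c.Name)
                .ToList();
      }

    LINQ itself is just the query syntax; Entity Framework (or another ORM) is the layer that turns it into SQL, so learning LINQ pays off whichever data-access library you end up on, and the raw SqlCommand style stays useful for the cases an ORM handles badly.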

    Read the article

  • Using R on your Oracle Data Warehouse

    - by jean-pierre.dijcks
    Since it is Predictive Analytics World in our backyard (or are we San Francisco’s backyard…?), I figured it is well worth the time to dust off some old but important news. With big data (should we start calling it “any data analytics” instead?) being the buzzword and analytics the key operative goal, not moving data around is becoming more and more critical to business users. Why? Because instead of spending time on moving data around into your next analytics server, you should be running analytics on those CPUs. You could always do this with Oracle Data Mining within the Oracle Database. But a lot of folks want to leverage R as their main tool. Well, this article describes how you can do this, since 2010… As Casimir Saternos concludes in the article: “There is a growing awareness of the need to effectively analyze astronomical amounts of data, much of which is stored in Oracle databases. Statistics and modeling techniques are used to improve a wide variety of business functions. ODM accessed using the R language increases the value of your data by uncovering additional information. RODM is a powerful tool to enable your organization to make predictions, classify data, and create visualizations that maximize effectiveness and efficiencies.” Happy Analysis!

    Read the article

  • Should one use a separate database for application data and user data?

    - by trycatch
    I’ve been working on a project for a little while and I’m unsure which is the better architecture. I’m interested in the consensus. The answer seems fairly obvious to me, but something about it is nagging at me and I can't pick out what. The TL;DR is: how do you handle a program with application data and user data in the same DB which needs to be able to receive updates to the application data periodically? One database for user data and one for application data, or both in one? The detailed version is: if an application has a database which needs to maintain application data AND user data, and the user data all references application data, it feels more natural to me to store them in the same database. But if there is a need to update the application data within this database periodically, should this be split into two databases so that one can simply download the updated application-data database file as an update and replace the old one? Or should they remain as one database, and the application data be updated via a script which inserts the new data into the existing database? The second sounds clearly preferable to me... but for some reason it just doesn’t feel right, and I can’t quite pick out why.
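    For what it's worth, the second option (one database, application data refreshed by a script) is straightforward to make idempotent so it can be re-run safely. The sketch below is a hypothetical illustration only: the CountryCodes table, the column names and the use of a T-SQL MERGE are assumptions for the example, not something from the question.

      using System.Data.SqlClient;

      public static class ReferenceDataUpdater
      {
          // Each release ships the current reference rows; the MERGE upserts them,
          // so re-running the update is harmless and user tables are never touched.
          public static void Apply(string connectionString, (string Code, string Name)[] countries)
          {
              using var conn = new SqlConnection(connectionString);
              conn.Open();
              foreach (var c in countries)
              {
                  using var cmd = new SqlCommand(
                      @"MERGE CountryCodes AS target
                        USING (SELECT @code AS Code, @name AS Name) AS source
                        ON target.Code = source.Code
                        WHEN MATCHED THEN UPDATE SET Name = source.Name
                        WHEN NOT MATCHED THEN INSERT (Code, Name) VALUES (source.Code, source.Name);",
                      conn);
                  cmd.Parameters.AddWithValue("@code", c.Code);
                  cmd.Parameters.AddWithValue("@name", c.Name);
                  cmd.ExecuteNonQuery();
              }
          }
      }

    The trade-off against the two-database swap is that the update script has to respect any keys the user tables reference in the application data, whereas a wholesale file swap leaves you to check by hand that nothing the user data points at has disappeared.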

    Read the article

  • What to learn after standard C++?

    - by Luca Cerone
    I switched to C++ a few months ago, learning its syntax, the main features of the STL, and what you can usually find in a "learn C++" manual. Now I would like to go further. What would be your recommendations? I would like to know what to learn next (not only about the language, but also debugging, frameworks, etc.). I know the answer probably depends on the specific needs of each user, so here is a list of mine: Cross-platform development; Developing GUIs for my programs; Developing extensible software, allowing the use of plugins; Use of scientific libraries; Interacting with databases (mainly MySQL); Having server/client functionality (I'd like users of my programs to interact through the internet... as you might have guessed, I am not a programmer by training, so I might have used the wrong terms; if so, I apologize for that). Of course I know it takes time, but I would like to have a good list of references and resources to start with (both books and websites are OK). Thanks a lot for your help!

    Read the article

  • So my employer wants me to do less programming and focus on IT support

    - by Rich
    I was hired into a non tech company's IT department as a programmer a few years back, and after several rounds of lay offs, we're down to a skeleton crew. I've saved the company hundreds of thousands of dollars with my projects and management has been happy with them (although most of the stakeholders have since left the company). Management now wants me to limit the programming that I do and spend most of my time on IT support: putting out fires, dealing with vendors, outsourced contractors, supporting company systems, managing projects, etc. I am a little burnt out on programming since I've been pushed pretty hard for the past several years. However, I'm not sure if this is a good career move in the long run. I'm a decent programmer (and also good with databases) but not obsessed with it to the point of coding outside of work. I'm approaching my mid 30s and there's potential ageism to deal with down the line. While I'm fortunate to have survived the lay offs, it sorta feels like my job is being "dumbed down". I have both good technical skills and people skills...but it doesn't take a genius to do what I'm doing now. And my success is being increasingly linked to others' performance rather than my own... Just looking for some advice. Is it time to move on? That's not really an easy thing to do since I'd likely have to move to another area to find another comparable tech job. Should I go after another pure technical role? Or should I stay and try to make this work? People say do what you "enjoy" but it doesn't really matter to me as long as I'm getting paid. Also the ageism thing is on the horizon and could be an issue eventually. I'm making a decent (but not great) salary. Should I chase money and maximize my income while I still have a chance? Or be happy with a moderate salary and 40 hour work week?

    Read the article

  • Multichannel Digital Engagement: Find Out How Your Organization Measures Up

    - by Michael Snow
    This article was originally published in the September 2013 Edition of the Oracle Information InDepth Newsletter ORACLE WEBCENTER EDITION Thanks to mobile and social technologies, interactive online experiences are now commonplace. Not only that, they give consumers more choices, influence, and control than ever before. So how can you make your organization stand out? The key building blocks for delivering exceptional cross-channel digital experiences are outlined below. Also, a new assessment tool is available to help you measure your organization's ability to deliver such experiences. A clearly defined digital strategy. The customer journey is growing increasingly complex, encompassing multiple touchpoints and channels. It used to be easy to map marketing efforts to specific offline channels; for example, a direct mail piece with an offer to visit a store for a discounted purchase. Now it is more difficult to cultivate and track such clear cause-and-effect relationships. To deliver an integrated digital experience in this more complex world, organizations need a clearly defined and comprehensive digital marketing strategy that is backed up by an integrated set of software, middleware, and hardware solutions. Strong support for business agility and speed-to-market. As both IT and marketing executives know, speed-to-market and business agility are key to competitive advantage. That means marketers need solutions to support the rapid implementation of online marketing initiatives—plus the flexibility to adapt quickly to a changing marketplace. And IT needs tools with the performance, scalability, and ease of integration to support marketing efforts. Both teams benefit when business users are empowered to implement marketing initiatives on their own, with minimal IT intervention. The ability to deliver relevant, personalized content. Delivering a one-size-fits-all online customer experience is no longer acceptable. Customers expect you to know who they are, including their preferences and past relationship with your brand. That means delivering the most relevant content from the moment a visitor enters your site. To make that happen, you need a powerful rules engine so that marketers and business users can easily define site visitor segments and deliver content accordingly. That includes both implicit targeting that is based on the user’s behavior, and explicit targeting that takes a user’s profile information into account. Ideally, the rules engine can also intelligently weight recommendations when multiple segments apply to a specific customer. Support for social interactivity. With the advent of Facebook and LinkedIn, visitors expect to participate in and contribute to your web presence—and share their experience on their own social networks. That requires easy incorporation of user-generated content such as comments, ratings, reviews, polls, and blogs; seamless integration with third-party social networking sites; and support for social login, which helps to remove barriers to social participation. The ability to deliver connected, multichannel experiences that include powerful, flexible mobile capabilities. By 2015, mobile usage is projected to surpass that of PCs and other wired devices. In other words, mobile is an essential element in delivering exceptional online customer experiences. This requires the creation and management of mobile experiences that are optimized for delivery to the thousands of different devices that are in use today. 
Just as important, organizations must be able to easily extend their traditional web presence to the mobile channel and deliver highly personalized and relevant multichannel marketing initiatives while also managing to minimize the time and effort required to manage mobile sites. Are you curious to know how your organization measures up when it comes to delivering an engaging, multichannel digital experience? If so, take this brief, 15-question online assessment and see how your organization scores in the areas of digital strategy, digital agility, relevance and personalization, social interactivity, and multichannel experience.

    Read the article
