Search Results

Search found 4137 results on 166 pages for 'reports'.


  • Crystal Reports API - chart: "for all records" or "for each record"?

    - by Epaga
    Is there any way to determine whether a chart in Crystal Reports 2008 (using either the RAS SDK or the older RDC API) is set to display values "for each record" or "for all records"? I can get hold of a CrystalDecisions.ReportAppServer.ReportDefModel.ChartObject, but I can't find any property on it that exposes which type of chart it is - "for each" or "for all".

    Read the article

  • Reports in Java. What tool to use?

    - by Tom
    Hi, I need to create some reports in different formats (xls, pdf, rtf). I am currently using JasperReports in conjunction with iReport. I have no major complaints about it (except for the cases when iReport messes up my XML files), but I've been having some problems with it when exporting to xls files containing "special" characters such as '&'. Is there a widely used alternative? Is JasperReports the right choice?

    Read the article

  • Using C# with Crystal Reports, How Can I Create 4-Up Subreports?

    - by C. Griffin
    The simplest example I can provide of what I want to do is this: I need to create a report whose only requirement is that I have four of the same subreport on the page (imagine 4 portrait-oriented postcards on a page), each quadrant using a separate row from my datatable, yet all 4 identical in terms of fields. If there are more than 4 rows, the report needs to carry over to a new page with the same format. I'm using C# and the built-in Crystal Reports Basic for the task.

    Read the article

  • How to show different icons in Crystal Reports depending on a field value?

    - by DarkDeny
    We are using Crystal Reports 12 in one of our projects. I need to create a report template which shows different icons based on the value of some field. That field contains a number storing some kind of status, and I have several icons corresponding to certain statuses. At the moment I can't figure out how to implement such a thing in the Crystal Reports designer. Could someone please help me?

    Read the article

  • Installing Forms and Reports on a development system

    - by Duncan Mills
    By popular demand I've resurrected / updated one of the old blog postings from Jan Carlin's Blog on GroundSide here. A recent (lengthy) post on the Forms forums chronicles the problems some of you have had installing F&R on a development machine. See the link in the headline of this post for the main one. When installing, here are some points to bear in mind:
    - Download and install WebLogic Server first. http://www.oracle.com/technology/software/products/middleware/index.html
    - Find the Forms and Reports (and Disco and Portal) zip files here. Download them to the desktop (or some other temporary directory of your choosing).
    - Unzip both of the two zip files into the same new directory (maybe called 'stage') and check that you have 4 directories in the stage dir when you are finished unzipping: 'Disk1', 'Disk2', 'Disk3' and 'Disk4'. These folders are specified in the zip file structure and must be preserved for the setup executable to work.
    - If you use WinZip and have a right-click menu option that says "Extract to here", use that by right-click-dragging the zip file onto the newly created directory. Don't use the "Extract to folder %HOME%\Desktop\ofm_pfrd...disk_1of2" option. That will get you into the trouble that was reported early in this thread.
    - Free up as much memory as you can. Stop services, background processes, virus scanners and databases (you don't need a DB to install Forms) and other things lurking about on your machine. You can restart them when the install is done. Around 1.5 GB of free real memory should do it. If it doesn't, free up more if you can. Don't change the swap space unless you know what you are doing; let Windows handle it. A 1 GB machine will likely not be enough. You will likely need at least 2 GB of RAM.
    - Start the install with setup.exe from the 'Disk1' directory.
    - Choose the Install and Configure option unless you have a good reason not to.
    - Choose a unique instance name even if you deinstalled and removed the last install. I suggest using 'asinst_20090722_1' (today's date in ISO format with an incremented number at the end if you install more than two times on a particular day).
    - Unselect Portal and Discoverer and select the Builders you want.
    - Unselect WebCache.
    - Unselect OHS.
    - Unselect the single sign-on option.
    - Check for any failures and choose the retry option if any occur. If that doesn't fix the problem, call Oracle Customer Support.

    Read the article

  • Izenda Reports 6.3 Top 10 Features

    - by gt0084e1
    Izenda 6.3 Top 10 New Features and Capabilities

    1. Izenda Maps Add-On. The Izenda Maps add-on allows rapid visualization of geographic or geo-spatial data. It is fully integrated with the rest of the Izenda report package and adds a Maps tab which allows users to add interactive maps to their reports. Contact your representative or [email protected] for limited-time discounts. Izenda Maps even has rich drill-down capabilities that allow you to dive deeper with a simple hover (also requires dashboards).

    2. Streamlined Pie Charts with "Other" Slices. The advanced properties of the pie chart now allow you to combine the smaller slices into a single "Other" slice. This reduces the visual complexity without throwing off the scale of the chart.

    3. Combined Bar + Line Charts. The bar chart now allows dual visualization of multiple metrics simultaneously by adding a line for secondary data. Enabled via AdHocSettings.AllowLineOnBar = true;

    4. Stacked Bar Charts. The stacked bar chart lets you see a breakdown of a measure based on categorical data. It is enabled with AdHocSettings.AllowStackedBarChart = true;

    5. Self-Joining Data Sources. The self-join feature allows parent-child relationships to be accessed from the Data Sources tab. The same table can be used as a secondary child table within the Report Designer.

    6. Report Design From Dashboard View. Dashboards now sport both view and design icons to allow quick access to both.

    7. Field Arithmetic on Dates. Differences between dates can now be used as measures with the arithmetic feature.

    8. Simplified Multi-Tenancy. Integrating with multi-tenant systems is now easier than ever. The following APIs have been added to facilitate common scenarios: AdHocSettings.CurrentUserTenantId = value; AdHocSettings.SchedulerTenantID = value; AdHocSettings.SchedulerTenantField = "AccountID";

    9. Support for SQL 2008 R2 and SQL Azure. Izenda now supports the latest version of Microsoft's database as well as the SQL Azure service.

    10. Enhanced Performance and Compatibility for Stored Procedures. Izenda now supports more stored procedures than ever and runs them faster too.
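    The switches quoted in items 3, 4 and 8 are plain assignments on AdHocSettings, so a natural place to make them is application startup. A minimal sketch, assuming an ASP.NET host and an Izenda.AdHoc namespace exposing AdHocSettings (the tenant value and the Global.asax placement are illustrative, not from the release notes):

    // Global.asax.cs - illustrative startup wiring for the 6.3 features above
    using System;
    using Izenda.AdHoc; // assumed namespace exposing AdHocSettings

    public class Global : System.Web.HttpApplication
    {
        protected void Application_Start(object sender, EventArgs e)
        {
            // Chart switches quoted in items 3 and 4
            AdHocSettings.AllowLineOnBar = true;       // combined bar + line charts
            AdHocSettings.AllowStackedBarChart = true; // stacked bar charts

            // Multi-tenancy hooks quoted in item 8 (values are placeholders)
            AdHocSettings.CurrentUserTenantId = 42;
            AdHocSettings.SchedulerTenantField = "AccountID";
        }
    }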

    Read the article

  • MaxTotalSizeInBytes - Blind spots in Usage file and Web Analytics Reports

    - by Gino Abraham
    Originally posted on: http://geekswithblogs.net/GinoAbraham/archive/2013/10/28/maxtotalsizeinbytes---blind-spots-in-usage-file-and-web-analytics.aspx
    http://blogs.msdn.com/b/sharepoint_strategery/archive/2012/04/16/usage-file-and-web-analytics-reports-with-blind-spots.aspx

    In my previous post (Troubleshooting SharePoint 2010 Web Analytics), I referenced a problem that can occur when exceeding the daily partition size for the LoggingDB, which generates the ULS message "[Partition] has exceeded the max bytes". Below, I wanted to provide some additional info on this particular issue and help identify some options if this occurs. As an aside, this post only applies if you are missing portions of Usage data - think blind spots on intermittent days, or user activity regularly sparse for the afternoon/evening. If this fits your scenario, read on. But if Usage logs are outright missing, go check out my Troubleshooting post first.

    Background on the problem:
    The LoggingDB database has a default maximum size of ~6GB. However, SharePoint evenly splits this total size into fixed-size logical partitions, and the number of partitions is defined by the number of days to retain Usage data (by default 14 days). In this case, 14 partitions would be created to account for the 14 days of retention. If the retention were halved to 7 days, the LoggingDB would be split into 7 corresponding partitions at twice the size. In other words, the partition size is generally defined as [max size for DB] / [number of retention days].

    Going back to the default scenario, the "max size" for the LoggingDB is 6200000000 bytes (~6GB) and the retention period is 14 days. Using our formula, this would be [~6GB] / [14 days], which equates to 444858368 bytes (~425MB) per partition per day. Again, if the retention were halved to 7 days (which halves the number of partitions), the resulting partition size becomes [~6GB] / [7 days], or ~850MB per partition.

    From my experience, when the partition size for any given day is exceeded, the usage logging for the remainder of the day is essentially thrown away, because SharePoint won't allow any more to be written to that day's partition. The only clue that this is occurring (beyond truncated usage data) is an error such as the following reported in the ULS:

    04/08/2012 09:30:04.78  OWSTIMER.EXE (0x1E24)  0x2C98  SharePoint Foundation  Health  i0m6  High  Table RequestUsage_Partition12 has 444858368 bytes that has exceeded the max bytes 444858368

    It's also worth noting that the exact bytes reported (e.g. '444858368' above) may vary slightly among farms. For example, you may instead see 445226812, 439123456, or something else in the ballpark. The exact number itself doesn't matter; this error message indicates that the reported usage has exceeded the partition size for the given day.

    What it means:
    The error itself is easy to miss, which can lead to substantial gaps in the reporting data (your mileage may vary) if not identified. At this point, I can only advise to periodically check the ULS logs for this message. Down the road, I plan to explore whether [Developing a Custom Health Rule] could be leveraged to identify the issue (if you've ever built Custom Health Rules, I'd be interested to hear about your experiences).
    Overcoming this issue also poses a challenge, with workaround options including:

    Lower the retention. Because the partition size is generally defined as [max size] / [number of retention days], the first option is to lower the number of days to retain the data - the lower the retention, the lower the divisor and thus a bigger partition. For example, halving the retention from 14 to 7 days would halve the number of partitions, but double the partition size to ~850MB (e.g. [6200000000 bytes] / [7 days] = ~850MB partitions). Lowering it to 2 days would result in two ~3GB partitions... and so on.

    Recreate the LoggingDB with an increased size. The property MaxTotalSizeInBytes is exposed by OM code for the SPUsageDefinition object and can be updated with the example PowerShell snippet below. However, updating this value has no immediate impact, because this size only applies when creating a LoggingDB. Therefore, you must create a new LoggingDB for the Usage Service Application. The gotcha: this effectively deletes all prior Usage data, because the Usage Service Application can only have a single LoggingDB.

    Here is an example snippet to update the "Page Requests" Usage Definition:

    $def = Get-SPUsageDefinition -Identity "page requests"
    $def.MaxTotalSizeInBytes = 12400000000
    $def.Update()

    Then create a new logging database and attach it to the Usage Service Application using the following command:

    Get-SPUsageApplication | Set-SPUsageApplication -DatabaseServer <dbServer> -DatabaseName <newDBname>

    Updated (5/10/2012): Once the new database has been created, you can confirm the setting has truly taken by running the following SQL query (be sure to replace the database name in the query with the name provided in the PowerShell above):

    SELECT * FROM [WSS_UsageApplication].[dbo].[Configuration] WITH (NOLOCK)
    WHERE ConfigName LIKE 'Max Total Bytes - RequestUsage'
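    As a quick sanity check, the [max size] / [retention days] arithmetic above is easy to tabulate; a throwaway sketch (the byte value is the post's default cap, and nothing here touches SharePoint itself):

    using System;

    class PartitionSizeCheck
    {
        static void Main()
        {
            long maxTotalSizeInBytes = 6200000000; // default ~6GB LoggingDB cap
            foreach (int retentionDays in new[] { 14, 7, 2 })
            {
                // One fixed-size partition per retained day
                long partitionBytes = maxTotalSizeInBytes / retentionDays;
                Console.WriteLine("{0,2} days -> {1:N0} bytes (~{2:F0} MB) per daily partition",
                    retentionDays, partitionBytes, partitionBytes / (1024.0 * 1024.0));
            }
        }
    }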

    Read the article

  • What is a typical scenario for an end-user report design?

    - by Sebastian
    Hello! I'm wondering what the typical scenario is for using an end-user report designer. What I'm thinking of is to have a base report with all the columns I can offer, along with a basic view of the report (formatting, order of columns, etc.), and then let the user change that format and order, and remove or add data (from the available columns), etc. Is that a common way to address what is called an end-user designer for reports, or am I off track? I know it depends on the user (whether it's someone who can handle SQL, for example), but is it common to have a scenario where the user can build everything from the SQL query to the formatting? Thanks! Sebastian

    Read the article

  • Calculation with dates and different locales in Crystal Reports for Eclipse?

    - by Bevor
    Hello, I'm using Crystal Reports for Eclipse 2.0.4 and I have a problem. I use a formula in a report to subtract one day from a string which is a date: ToText(CDate({Agreement.EndDate})-1, "dd.MM.yyyy"); This works for the German locale. With an English locale, the calculation is completely wrong because the day and month are interchanged. For example: when {Agreement.EndDate} is 07.05.2010 and I subtract one day from it, I get 06.04.2010 with the German locale but 04.07.2010 with an English locale. How can I solve this so that it works for different locales?
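    The failure mode here is day/month ambiguity: the string is parsed under one locale's pattern before the subtraction ever happens. A small C# analogue making the two readings explicit (the values mirror the example above; treating CDate as a locale-sensitive parse is an assumption about Crystal's behavior):

    using System;
    using System.Globalization;

    class LocaleDateDemo
    {
        static void Main()
        {
            string endDate = "07.05.2010"; // the {Agreement.EndDate} string from the example

            // The two readings a locale-sensitive parse can choose between:
            DateTime german  = DateTime.ParseExact(endDate, "dd.MM.yyyy", CultureInfo.InvariantCulture); // 7 May 2010
            DateTime english = DateTime.ParseExact(endDate, "MM.dd.yyyy", CultureInfo.InvariantCulture); // 5 July 2010

            Console.WriteLine(german.AddDays(-1).ToString("dd.MM.yyyy"));  // 06.05.2010
            Console.WriteLine(english.AddDays(-1).ToString("dd.MM.yyyy")); // 04.07.2010 - the reported wrong result
        }
    }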

    Read the article

  • Creating an ASP.NET report using Visual Studio 2010 - Part 3

    - by rajbk
    We continue building our report in this three part series.

    Creating an ASP.NET report using Visual Studio 2010 - Part 1
    Creating an ASP.NET report using Visual Studio 2010 - Part 2

    Adding the ReportViewer control and filter drop downs

    Open the source code for index.aspx and add a ScriptManager control. This control is required for the ReportViewer control. Add a DropDownList for the categories and suppliers. Add the ReportViewer control. The markup after these steps is shown below.

    <div>
      <asp:ScriptManager ID="smScriptManager" runat="server">
      </asp:ScriptManager>
      <div id="searchFilter">
        Filter by: Category :
        <asp:DropDownList ID="ddlCategories" runat="server" />
        and Supplier :
        <asp:DropDownList ID="ddlSuppliers" runat="server" />
      </div>
      <rsweb:ReportViewer ID="rvProducts" runat="server">
      </rsweb:ReportViewer>
    </div>

    The dropdowns will display the categories and suppliers in the database. Changing the selection in a drop down will cause the report to be filtered by the selections. You will see how to do this in the next steps.

    Attach the RDLC to the ReportViewer control by clicking on the top right of the control, going to "ReportViewer Tasks" and selecting Products.rdlc. Resize the ReportViewer control by dragging at the bottom right corner. I set mine to 800px x 500px; you can also set this value in source view.

    Defining the data sources

    We will now define the data source used to populate the report. Go back to the "ReportViewer Tasks" and select "Choose Data Sources". Select a "New data source...". Select "Object" and name your data source ID "odsProducts". In the next screen, choose "ProductRepository" as your business object, then choose "GetProductsProjected".

    The method requires a SupplierID and CategoryID. We will set these so that our data source gets the values from the drop down lists we defined earlier. Set the parameter source to be of type "Control" and set the ControlIDs to be ddlSuppliers and ddlCategories respectively.

    We are now going to define the data source for our drop downs. Select the ddlCategories drop down and pick "Choose Data Source". Pick "Object" and give it an ID of "odsCategories". In the next screens, choose "ProductRepository", select the GetCategories() method, and then select "CategoryName" and "CategoryID". We are done defining the data source for the category drop down. Perform the same steps for the suppliers drop down.

    Select each dropdown and set AppendDataBoundItems to true and AutoPostBack to true. AppendDataBoundItems is needed because we are going to insert an "All" list item with an empty value into each drop down.

    Finally, double click on each drop down in the designer and add the following code in the code behind. This, along with the AutoPostBack="true" attribute, refreshes the report any time a drop down selection is changed.

    protected void ddlCategories_SelectedIndexChanged(object sender, EventArgs e)
    {
        rvProducts.LocalReport.Refresh();
    }

    protected void ddlSuppliers_SelectedIndexChanged(object sender, EventArgs e)
    {
        rvProducts.LocalReport.Refresh();
    }

    Compile your report and run the page. You should see the report rendered. Note that the tool bar in the ReportViewer control gives you a couple of options, including the ability to export the data to Excel, PDF or Word.
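    For readers joining at Part 3, here is a minimal sketch of the shape of the business object the wizard screens above bind to. The method names (GetProductsProjected, GetCategories) come from the steps; the parameter and return types are assumptions standing in for the Part 1 code:

    using System;
    using System.Collections.Generic;

    // Placeholder types; the real ones were built in Part 1
    public class ProductViewModel { /* the columns the report projects */ }
    public class Category { public int CategoryID; public string CategoryName; }

    public class ProductRepository
    {
        // Bound to odsProducts; SupplierID and CategoryID arrive from the
        // ddlSuppliers / ddlCategories control parameters ("All" sends an empty value)
        public List<ProductViewModel> GetProductsProjected(int? supplierId, int? categoryId)
        {
            throw new NotImplementedException(); // Part 1 queries Northwind here
        }

        // Bound to odsCategories; CategoryName / CategoryID feed the drop down
        public List<Category> GetCategories()
        {
            throw new NotImplementedException();
        }
    }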
    Conclusion

    Through this three part series, we did the following:

    - Created a data layer for use by our RDLC.
    - Created an RDLC using the report wizard and defined a dataset for the report.
    - Used the report design surface to design our report, including adding a chart.
    - Used the ReportViewer control to attach the RDLC.
    - Connected our ReportViewer to a data source and took parameter values from the drop down lists.
    - Used AutoPostBack to refresh the report when a dropdown selection was changed.

    RDLCs allow you to create interactive reports including drill downs and grouping. For even more advanced reports you can use Microsoft SQL Server Reporting Services with RDLs. With RDLs, the report is rendered on the report server instead of the web server. Another nice thing about RDLs is that you can define a parameter list for the report and it gets rendered automatically for you. RDLCs and RDLs both have their advantages, and it's best to compare them and choose the right one for your requirements.

    Download VS2010 RTM Sample project: NorthwindReports.zip

    Alfred Borden: Are you watching closely?

    Read the article

  • SQL SERVER – SSMS: Database Consistency History Report

    - by Pinal Dave
    Doctor and Database

    The last place I like to visit is a hospital. With the monsoon season starting and intermittent rains, it has become sort of a routine to get a cycle of fever every other year (seriously, I hate it). So when I visit my doctor, it is always interesting how he quizzes me. The routine questions: "How many days have you had this?", "Is there any pattern?", "Did you get drenched in the rain?", "Do you have any other symptoms?" and so on. The idea is that the doctor wants to find an anomaly or a pattern that will point him to a viral or bacterial type. Most of the time they get it based on experience, and sometimes after a battery of tests. So if there is consistent behavior to your problem, there is always a solution. SQL Server has its own way to find whether the server's data / files are in a consistent state: the DBCC commands.

    Back to SQL Server

    In real life, a database consistency check is one of the critical operations a DBA generally doesn't give much priority. Many readers of my blog have asked: how do we know if the database is consistent? How do I read the output of DBCC CHECKDB and find out whether everything is right or not? My common answer to all of them is: look at the bottom of the CHECKDB (or CHECKTABLE) output and look for the line below.

    CHECKDB found 0 allocation errors and 0 consistency errors in database 'DatabaseName'.

    The above is a "good sign" because we are seeing zero allocation errors and zero consistency errors. If you are seeing non-zero errors, then there is some problem with the database. Sample output is shown below:

    CHECKDB found 0 allocation errors and 2 consistency errors in database 'DatabaseName'.
    repair_allow_data_loss is the minimum repair level for the errors found by DBCC CHECKDB (DatabaseName).

    If we see non-zero errors, then most of the time (not always) we get repair options depending on the level of corruption. There is risk involved with the above option (repair_allow_data_loss), namely that we would lose data. Sometimes the option would be repair_rebuild, which is a little safer. Though these options are available, it is important to find the root cause of the problem.

    Among the standard reports there is one which can show the history of CHECKDB executions for the selected database. Since this is a database-level report, we need to right click on the database, click Reports, click Standard Reports and then choose "Database Consistency History". The information in this report is picked from the default trace. If the default trace is disabled, or there has been no CHECKDB run, or the information is no longer in the default trace (because it rolled over), the report says it very clearly: Currently, no execution history of CHECKDB is available or default trace is not enabled.

    To demonstrate, I caused corruption in one of my databases and did the following steps:
    - Run CHECKDB so that errors are reported.
    - Fix the corruption by losing the data, using the repair option.
    - Run CHECKDB again to check whether the corruption is cleared.

    After that I launched the report to see the results. If you are lazy like me and don't want to run the report manually for each database, the query below provides the same report for all databases. This is the query the report runs behind the scenes; all I have done is remove the filter for the database name (commented out at the end, highlighted).
    DECLARE @curr_tracefilename VARCHAR(500);
    DECLARE @base_tracefilename VARCHAR(500);
    DECLARE @indx INT;

    SELECT @curr_tracefilename = path FROM sys.traces WHERE is_default = 1;
    SET @curr_tracefilename = REVERSE(@curr_tracefilename);
    SELECT @indx = PATINDEX('%\%', @curr_tracefilename);
    SET @curr_tracefilename = REVERSE(@curr_tracefilename);
    SET @base_tracefilename = LEFT(@curr_tracefilename, LEN(@curr_tracefilename) - @indx) + '\log.trc';

    SELECT SUBSTRING(CONVERT(NVARCHAR(MAX), TEXTData), 36, PATINDEX('%executed%', TEXTData) - 36) AS command,
           LoginName,
           StartTime,
           CONVERT(INT, SUBSTRING(CONVERT(NVARCHAR(MAX), TEXTData), PATINDEX('%found%', TEXTData) + 6,
                   PATINDEX('%errors %', TEXTData) - PATINDEX('%found%', TEXTData) - 6)) AS errors,
           CONVERT(INT, SUBSTRING(CONVERT(NVARCHAR(MAX), TEXTData), PATINDEX('%repaired%', TEXTData) + 9,
                   PATINDEX('%errors.%', TEXTData) - PATINDEX('%repaired%', TEXTData) - 9)) AS repaired,
           SUBSTRING(CONVERT(NVARCHAR(MAX), TEXTData), PATINDEX('%time:%', TEXTData) + 6,
                   PATINDEX('%hours%', TEXTData) - PATINDEX('%time:%', TEXTData) - 6) + ':' +
           SUBSTRING(CONVERT(NVARCHAR(MAX), TEXTData), PATINDEX('%hours%', TEXTData) + 6,
                   PATINDEX('%minutes%', TEXTData) - PATINDEX('%hours%', TEXTData) - 6) + ':' +
           SUBSTRING(CONVERT(NVARCHAR(MAX), TEXTData), PATINDEX('%minutes%', TEXTData) + 8,
                   PATINDEX('%seconds.%', TEXTData) - PATINDEX('%minutes%', TEXTData) - 8) AS time
    FROM ::fn_trace_gettable(@base_tracefilename, DEFAULT)
    WHERE EventClass = 22
      AND SUBSTRING(TEXTData, 36, 12) = 'DBCC CHECKDB'
    -- AND DatabaseName = @DatabaseName;

    Don't get worried about the logic above. All it does is read the trace files and parse entries like the one below, pulling out the underlined pieces of information:

    DBCC CHECKDB (CorruptedDatabase) executed by sa found 2 errors and repaired 0 errors. Elapsed time: 0 hours 0 minutes 0 seconds. Internal database snapshot has split point LSN = 00000029:00000030:0001 and first LSN = 00000029:00000020:0001.

    Hopefully from now on you will run CHECKDB and understand the importance of it. As responsible DBAs I am sure you are already doing it; let me know how often you actually run it on your production environment.

    Reference: Pinal Dave (http://blog.sqlauthority.com)
    Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Server Management Studio, SQL Tips and Tricks, T SQL
    Tagged: SQL Reports

    Read the article

  • How Can I Create Reports in a Custom C#.NET Windows Application? - General Question

    - by user311509
    Assume I have a custom Windows application written in C#. This application has only the following functionality: add, edit, delete and view. For example, a user can add a sale, change a sales record, delete a sales record or view the whole sales record. I need to add some reporting functionality, e.g. I want a user to be able to print the sales of a certain customer from 2008 to 2009 to PDF, see which products a certain customer has purchased from us, and so on. I will only include the basic common report requests that are usually needed in the office. Any other kind of report that is requested inconsistently, I would produce manually at the back end and send the results to the requester. What I would do is this: if a user wants more info on a certain customer, a special window appears for that customer. This window has different controls that allow the user to request more info, such as printing customer purchases from ..... to ..... (the user chooses the dates), and the user views the results as PDF or so. Of course, behind the scenes I would write an appropriate SQL query with parameters for each such function. Is this how it should be done? I have heard about SQL Reporting; I don't know anything about it yet, but I will check it out. Anyhow, your suggestions won't harm. I'm still a student, so I don't have practical experience yet. I hope my question is clear enough. Thank you.
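    The parameterized-query approach described above could look like the following in ADO.NET. Every table and column name here is invented for illustration; the point is the pattern of passing the user's dates as parameters rather than concatenating them into the SQL:

    using System;
    using System.Data;
    using System.Data.SqlClient;

    class CustomerPurchasesReport
    {
        // Fetches one customer's purchases in a date range (all identifiers hypothetical)
        static DataTable GetPurchases(string connStr, int customerId, DateTime from, DateTime to)
        {
            const string sql =
                @"SELECT s.SaleDate, s.ProductName, s.Amount
                  FROM Sales s
                  WHERE s.CustomerId = @customerId
                    AND s.SaleDate BETWEEN @from AND @to
                  ORDER BY s.SaleDate";

            using (var conn = new SqlConnection(connStr))
            using (var cmd = new SqlCommand(sql, conn))
            {
                cmd.Parameters.AddWithValue("@customerId", customerId);
                cmd.Parameters.AddWithValue("@from", from);
                cmd.Parameters.AddWithValue("@to", to);

                var table = new DataTable();
                new SqlDataAdapter(cmd).Fill(table); // Fill opens/closes the connection itself
                return table;
            }
        }
    }

    The resulting DataTable can then be handed to whatever component renders the PDF.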

    Read the article

  • How do I show all group headers in Access 2007 reports?

    - by Newbie
    This is a question about Reports in Access 2007. I'm unsure whether the solution will involve any programming, but hopefully someone will be able to help me. I have a report which lists all records from a particular table (call it A), and groups them by their associated record in a related table (call it B). I use the 'group headers' to add the information from table-B into the report. The problem occurs when I filter the records from table-A that are shown in the report. If I filter out all table-A records that relate to a particular record (call it X) in table-B, the report no longer shows the record-X group header. As a possible workaround, I have tried to ensure that I have one empty record in table-A for each of the records in table-B. That way I can specify NOT to filter out these empty records. However, the outcome is ugly one-record-high blank spaces at the start of each group in the report. Does anyone know of an alternative solution?

    Read the article

  • Crystal Reports: How to add an external assembly class?

    - by Sunil
    I am using VS2010, Crystal Reports 13 and MVC3. My problem is that I am unable to add an external assembly in Crystal Reports using the "Database Expert" option. I have a class named WeeklyReportModel in an external assembly. In my web project, data is retrieved from the DB as an IEnumerable collection of WeeklyReportModel. I tried Project Data - .NET Objects in Crystal Reports to add the WeeklyReportModel, but the external assembly does not show up under ".NET Objects". Then I tried another option: Create New Connection - ADO.NET - Make New Connection, pointing at the external assembly. It was added under the ADO.NET node, but expanding it displays "...no items found...". Totally frustrated. Please help.

    External assembly class:

    namespace SMS.Domain
    {
        public class WeeklyReportModel
        {
            public int StoreId { get; set; }
            public string StoreName { get; set; }
            public decimal Saturday { get; set; }
            public decimal Sunday { get; set; }
            public decimal Monday { get; set; }
            public decimal Tuesday { get; set; }
            public decimal Wednesday { get; set; }
            public decimal Thursday { get; set; }
            public decimal Friday { get; set; }
            public decimal Average { get; set; }
            public string DateRange { get; set; }
        }
    }

    In the controller action (data retrieved as a collection of WeeklyReportModel):

    namespace SMS.UI.Controllers
    {
        public class ReportController : Controller
        {
            public ActionResult StoreWeeklyReport(string id)
            {
                string[] dateArray = id.Split('_');
                DateTime weekStart = Convert.ToDateTime(dateArray[0]);
                DateTime weekClose = Convert.ToDateTime(dateArray[1]);

                SMS.Infrastructure.Report.AuditReport weeklyReport = new SMS.Infrastructure.Report.AuditReport();
                IEnumerable<SMS.Domain.WeeklyReportModel> weeklyRpt =
                    weeklyReport.ReportByStore().WeeklyReport(weekStart, weekClose);

                Session["WeeklyData"] = weeklyRpt;
                Response.Redirect("~/Reports/Weekly/StoreWeekly.aspx");
                return View();
            }
        }
    }

    Thanks in advance.
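    A common workaround when the designer won't list a .NET object is to design the report against a matching field layout and push the data at runtime with ReportDocument.SetDataSource, which accepts an IEnumerable. A sketch of what that could look like in the StoreWeekly.aspx code-behind (the .rpt path and the viewer control name are assumptions):

    using System;
    using System.Collections.Generic;
    using CrystalDecisions.CrystalReports.Engine;

    public partial class StoreWeekly : System.Web.UI.Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            // Collection stashed in Session by ReportController.StoreWeeklyReport
            var data = (IEnumerable<SMS.Domain.WeeklyReportModel>)Session["WeeklyData"];

            var report = new ReportDocument();
            report.Load(Server.MapPath("~/Reports/Weekly/WeeklyReport.rpt")); // assumed path
            report.SetDataSource(data); // push model: no Database Expert connection required

            crystalReportViewer.ReportSource = report; // assumed CrystalReportViewer on the page
        }
    }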

    Read the article

  • Oracle Primavera Partner Programs

    - by mark.kromer
    Here is the slide presentation (only the slides that can be shared at this time) for our Oracle Primavera partner programs, focusing on expanding P6's workflow and reporting capabilities. By leveraging Oracle's BPM and BI Publisher products, you can build exciting new workflows and enhanced reports that expand the capabilities of the Primavera applications.

    Read the article

  • Automate delivery of Crystal Reports With a Windows Service

    In this article, Vince demonstrates the creation of a Windows service that automatically runs and sends a Crystal Report as an email attachment. After a basic introduction, he examines the creation of the database and the Windows service with the help of relevant source code and explanations. Towards the end of the article, Vince discusses the steps to be followed in order to install the Windows service.
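    The core of such a service, running the report to disk and mailing it, comes down to a handful of calls; a minimal sketch of that step (paths, addresses and SMTP host are placeholders; ExportToDisk and System.Net.Mail are the standard APIs involved, not necessarily Vince's exact code):

    using System.Net.Mail;
    using CrystalDecisions.CrystalReports.Engine;
    using CrystalDecisions.Shared;

    class ReportMailer
    {
        // Exports one report to PDF and emails it; typically invoked from the
        // Windows service's timer callback described in the article.
        public static void RunAndSend()
        {
            var report = new ReportDocument();
            report.Load(@"C:\Reports\DailySales.rpt");                    // placeholder path
            report.ExportToDisk(ExportFormatType.PortableDocFormat,
                                @"C:\Reports\DailySales.pdf");
            report.Close();

            using (var message = new MailMessage("reports@example.com",  // placeholder addresses
                                                 "manager@example.com"))
            {
                message.Subject = "Daily sales report";
                message.Attachments.Add(new Attachment(@"C:\Reports\DailySales.pdf"));
                new SmtpClient("smtp.example.com").Send(message);        // placeholder host
            }
        }
    }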

    Read the article

  • Alloy Navigator 6 Automates Reports, Integrates with Exchange

    Alloy Software has released Alloy Navigator 6, an update to its Navigator integrated IT operations management application.

    Read the article

  • Update to kernel 3.12 seems to fail: uname reports old rc7

    - by carlo
    I currently run Xubuntu 13.10 with kernel 3.12 rc7. Today I tried updating to the latest 3.12 kernel (non-rc), but this seems to fail. When installing the image and headers I see the following error passing by:

    ...
    run-parts: executing /etc/kernel/postinst.d/dkms 3.12.0-031200-generic /boot/vmlinuz-3.12.0-031200-generic
    Error! The dkms.conf for this module includes a BUILD_EXCLUSIVE directive which does not match this kernel/arch. This indicates that it should not be built.
    ...

    After rebooting, when I do uname -r or cat /proc/version, it tells me that I'm still running on the old rc7 kernel. Since my microphone wasn't working on my Sony Vaio Pro 13, I downloaded and installed the latest ALSA drivers using the oem-audio-hda-daily-dkms package, which did fix the problem (with the mic). Maybe this has something to do with it? I also tried removing the package using sudo apt-get purge oem-audio-hda-daily-dkms, but no success.

    Read the article

  • Ubuntu 13.10: Installing MariaDB when apt reports MariaDB has unmet dependencies or broken packages

    - by Ecaz
    I have tried everything to install MariaDB on this clean Ubuntu installation, but I keep getting this error:

    Some packages could not be installed. This may mean that you have requested an impossible situation or, if you are using the unstable distribution, that some required packages have not yet been created or been moved out of Incoming. The following information may help to resolve the situation:

    The following packages have unmet dependencies:
     mariadb-server : Depends: mariadb-server-5.5 (= 5.5.33a+maria-1~saucy) but it is not going to be installed
    E: Unable to correct problems, you have held broken packages.

    I have followed this guide to try and install it: http://www.unixmen.com/install-lemp-server-nginx-mysql-mariadb-php-ubuntu-13-10-server/ and I have also followed the "official" guide on the MariaDB downloads page for 13.10: https://downloads.mariadb.org/mariadb/repositories/ But nothing seems to be working.

    Edit 1: I have tried both "How do I resolve unmet dependencies?" and "How to install MariaDB?" but it still gives me the error I posted above. It's a fresh Ubuntu install with hardly anything installed.

    Edit 2: All the check boxes are ticked in Updates. I ran:

    sudo apt-get update && sudo apt-get -f install mariadb-server-5.5"=5.5.33a+maria-1~saucy"

    and it gave me this error:

    The following packages have unmet dependencies:
     mariadb-server-5.5 : Depends: mariadb-client-5.5 (>= 5.5.33a+maria-1~saucy) but it is not going to be installed
                          Depends: mariadb-server-core-5.5 (>= 5.5.33a+maria-1~saucy) but it is not going to be installed
    E: Unable to correct problems, you have held broken packages.

    Read the article
