Noticias

Category: Noticias

Hidden network adapters

Sometimes our Azure servers (Windows), which are shut down and deallocated every night, experienced connectivity issues when trying to access shared folders in the same Virtual Network, or even when connecting to Active Directory.

I have noticed that every time an Azure VM is shut down and deallocated, as soon as it starts up again a new network adapter is created and assigned for use by the operating system. These network adapters can be found in Device Manager under Network Adapters, with a # at the end of their names.

That led me to do some research, and I found that the connectivity issue was caused by “ghost” (hidden) network adapters created in the past, every time the server was turned off (deallocated) and on again.

I also consulted Microsoft Support, and they told me that whenever a server is shut down and deallocated, its resources (adapters) are freed to avoid overloading the node/host.

Then, when the server restarts, the resources are allocated again without losing any configuration, which is why you get new network devices.

To prevent the communication/network issues mentioned above, I had to go into Device Manager at least once a week (since we only have Windows servers), choose to show hidden devices, and then uninstall all hidden/ghost network adapters.

Steps:

  • To open the Device Manager you can run these from a CMD as administrator:
    set devmgr_show_nonpresent_devices=1
    start devmgmt.msc
  • Once in the Device Manager go to View → Show Hidden Devices
  • Open Network Adapters section and delete/uninstall hidden adapters. They will appear grayed out.
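The cleanup itself is manual, but spotting candidates can be scripted: Windows appends a “#N” suffix when it re-enumerates the same adapter, so duplicate entries in an exported device list can be flagged automatically. A minimal sketch in Python, with hypothetical adapter names:

```python
import re

def find_duplicate_adapters(device_names):
    """Return adapter entries whose name carries a '#N' duplicate
    suffix, the pattern Windows uses when it re-enumerates an adapter."""
    pattern = re.compile(r"#\d+$")
    return [name for name in device_names if pattern.search(name)]

names = [
    "Microsoft Hyper-V Network Adapter",
    "Microsoft Hyper-V Network Adapter #2",
    "Microsoft Hyper-V Network Adapter #3",
]
print(find_duplicate_adapters(names))
# → ['Microsoft Hyper-V Network Adapter #2', 'Microsoft Hyper-V Network Adapter #3']
```

Entries flagged this way still need to be checked in Device Manager; only the grayed-out (non-present) ones should be uninstalled.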

Compiled UDFs

Contents

 

Introduction

Why create compiled SQL procedures?

Debugging compiled functions

Database restore

Introduction

User-defined functions are routines that can accept parameters, perform calculations or other actions, and return a result. You can write user-defined functions in any Microsoft .NET Framework programming language, such as Microsoft Visual Basic .NET or Microsoft Visual C#.

Source (Microsoft MSDN): https://msdn.microsoft.com/es-es/library/ms254508(v=vs.110).aspx

Why create compiled SQL procedures?

 

Microsoft SQL Server supports user-defined types (UDTs) implemented with the Common Language Runtime (CLR) of the Microsoft .NET Framework. The CLR is integrated into SQL Server, and this mechanism makes it possible to extend the database's type system. User-defined types give you extensibility of the SQL Server data type system, as well as the ability to define complex structured types.

 

From an application-architecture perspective, they provide two key advantages:

  • Strong encapsulation (on both the client and the server) between internal state and external behaviors.
  • Tight integration with other related server features. Once you have defined your own user-defined types, you can use them in every context where a SQL Server system type can be used, such as column definitions, variables, parameters, function results, cursors, triggers, and replication.

Source (Microsoft MSDN): https://msdn.microsoft.com/es-es/library/ms254944(v=vs.110).aspx

Debugging compiled functions

 

When debugging compiled functions, keep in mind that Visual Studio's debugging settings must be adjusted. To make these adjustments, click “Debug” and then “Options and Settings...”.

Then, under the “Debugging” -> “General” section, the option “Enable Just My Code (Managed only)” is checked by default; it must be unchecked.

If that option is left checked, debugging a compiled function will only show its result, without letting you step into it, because the routine is not native C# code.

Once this adjustment is made, right-click the test script you want to run and click “Set as Default Debug Script”. Then, to start debugging, click “Debug Script”.

 


Database restore

 

When restoring a database that contains compiled .NET procedures, keep in mind that the assemblies generated for those procedures must be dropped and recreated. Below, a sample database is used to show the steps to follow for the restore.

  1. Restore the database.
  2. Once the database is restored, we must drop the assemblies and recreate them. To see which ones they are, go to Programmability -> Assemblies.
  3. Set the database to TRUSTWORTHY ON. This statement tells the database engine that it can trust the assemblies and the contents of the database:

ALTER DATABASE DB_NAME SET TRUSTWORTHY ON

 

  4. The next step is to drop the assemblies and their dependencies. It is important to run the statements in this order: if the compiled stored procedures and functions are not dropped first, the assemblies cannot be dropped.

 

DROP PROCEDURE Calculation_SP1
DROP PROCEDURE Calculation_SP2
DROP FUNCTION UDF1

DROP ASSEMBLY [Example.SP.SQL.Calculation]
DROP ASSEMBLY [System.Data.Entity]
DROP ASSEMBLY [System.Runtime.Serialization]
DROP ASSEMBLY [SMDiagnostics]
DROP ASSEMBLY [System.ComponentModel.DataAnnotations]

 

  5. Right-click Assemblies and click Refresh; no assembly related to the compiled functions should remain.

 

  6. Now recreate the assemblies dropped in the previous step:

 

CREATE ASSEMBLY [SMDiagnostics]
AUTHORIZATION [dbo] /* the owner may be [linksis] instead */
FROM 'C:\Windows\Microsoft.NET\Framework64\v3.0\Windows Communication Foundation\SMDiagnostics.dll'
WITH PERMISSION_SET = UNSAFE

CREATE ASSEMBLY [System.Runtime.Serialization]
AUTHORIZATION [dbo] /* the owner may be [linksis] instead */
FROM 'C:\Windows\Microsoft.NET\Framework\v3.0\Windows Communication Foundation\system.runtime.serialization.dll'
WITH PERMISSION_SET = UNSAFE

CREATE ASSEMBLY [System.Data.Entity]
AUTHORIZATION [dbo] /* the owner may be [linksis] instead */
FROM 'C:\Program Files\Reference Assemblies\Microsoft\Framework\v3.5\system.data.entity.dll'
WITH PERMISSION_SET = UNSAFE

CREATE ASSEMBLY [System.ComponentModel.DataAnnotations]
AUTHORIZATION [dbo] /* the owner may be [linksis] instead */
FROM 'C:\Program Files\Reference Assemblies\Microsoft\Framework\v3.5\system.componentmodel.dataannotations.dll'
WITH PERMISSION_SET = UNSAFE

 

  7. Now recreate the assemblies that belong to our project: Ejemplo.Model.dll and Ejemplo.SQL.CalculationEngine.dll.

 

Note: these DLLs belong to our project.

 

DECLARE @AssemblyPath nvarchar(255)
-- Set @AssemblyPath to the folder (with trailing backslash) that contains the project DLLs.
-- CREATE ASSEMBLY ... FROM only accepts a string literal, so the path
-- must be spliced in through dynamic SQL:

EXEC (N'CREATE ASSEMBLY [Ejemplo.Model]
AUTHORIZATION [dbo]
FROM ''' + @AssemblyPath + N'Ejemplo.Model.dll''
WITH PERMISSION_SET = UNSAFE')

EXEC (N'CREATE ASSEMBLY [Ejemplo.CalculationEngine]
AUTHORIZATION [dbo]
FROM ''' + @AssemblyPath + N'Ejemplo.CalculationEngine.dll''
WITH PERMISSION_SET = UNSAFE')
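A caveat with the step above: the FROM clause of CREATE ASSEMBLY accepts only a string literal, not an expression, so deployment scripts typically build the statement text with the path already spliced in (via dynamic SQL or a script generator). A minimal sketch of such a generator, using hypothetical helper and path names, in Python:

```python
def create_assembly_sql(assembly_name, folder, dll,
                        owner="dbo", permission_set="UNSAFE"):
    """Build a CREATE ASSEMBLY statement with the DLL path embedded
    as a literal, ready to be sent to the server as-is."""
    path = folder.rstrip("\\") + "\\" + dll
    return (
        f"CREATE ASSEMBLY [{assembly_name}]\n"
        f"AUTHORIZATION [{owner}]\n"
        f"FROM '{path}'\n"
        f"WITH PERMISSION_SET = {permission_set}"
    )

sql = create_assembly_sql("Ejemplo.Model", "C:\\Deploy", "Ejemplo.Model.dll")
print(sql)
```

Generating the statements this way also makes it easy to loop over every DLL in the deployment folder and keep the owner and permission set consistent.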

 

  8. Once the assemblies are created, create the compiled stored procedures and functions.

 

CREATE PROCEDURE [dbo].[Calculation_SP1]
WITH EXECUTE AS CALLER
AS
EXTERNAL NAME [Calculation].[StoredProcedures].[SP1]
GO

EXEC sys.sp_addextendedproperty @name=N'AutoDeployed', @value=N'yes', @level0type=N'SCHEMA', @level0name=N'dbo', @level1type=N'PROCEDURE', @level1name=N'Calculation_SP1'
EXEC sys.sp_addextendedproperty @name=N'SqlAssemblyFile', @value=N'StoredProcedures.cs', @level0type=N'SCHEMA', @level0name=N'dbo', @level1type=N'PROCEDURE', @level1name=N'Calculation_SP1'
EXEC sys.sp_addextendedproperty @name=N'SqlAssemblyFileLine', @value=16, @level0type=N'SCHEMA', @level0name=N'dbo', @level1type=N'PROCEDURE', @level1name=N'Calculation_SP1'
GO

CREATE FUNCTION [dbo].[UDF_1](@Param [int])
RETURNS [float] WITH EXECUTE AS CALLER
AS
EXTERNAL NAME [Calculation].[UserDefinedFunctions].[UDF_1]
GO

EXEC sys.sp_addextendedproperty @name=N'AutoDeployed', @value=N'yes', @level0type=N'SCHEMA', @level0name=N'dbo', @level1type=N'FUNCTION', @level1name=N'UDF_1'
EXEC sys.sp_addextendedproperty @name=N'SqlAssemblyFile', @value=N'UserDefinedFunctions.cs', @level0type=N'SCHEMA', @level0name=N'dbo', @level1type=N'FUNCTION', @level1name=N'UDF_1'
EXEC sys.sp_addextendedproperty @name=N'SqlAssemblyFileLine', @value=10, @level0type=N'SCHEMA', @level0name=N'dbo', @level1type=N'FUNCTION', @level1name=N'UDF_1'

 

  9. Finally, to verify that the compiled routines are working correctly, run a compiled stored procedure from SQL Server Management Studio, for example.

 

EXEC dbo.Ejemplo.GetCustomers

Sybase IQ – Linking enterprise services

Sybase IQ – Linking external servers

 

How to link external data servers and use remote tables

 

Create a connection to an Oracle server

Open the wizard through the New Remote Server option.

Wizard:

Choose the server type

Connection mechanism:

Select whether the connection will be read-only

User information

Review the creation command and execute it

 

Create a remote table

Select New Proxy Table

Choose the connection

Enter the schema if necessary.

Select the table
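Behind the wizard, Sybase IQ ultimately runs CREATE SERVER and CREATE EXISTING TABLE statements. A minimal sketch that renders those statements, using hypothetical server, DSN and table names (the actual server class and USING string depend on your driver):

```python
def link_server_sql(server, server_class, using):
    """Render the CREATE SERVER statement the wizard would generate
    to register a remote data server."""
    return f"CREATE SERVER {server} CLASS '{server_class}' USING '{using}'"

def proxy_table_sql(table, server, remote_owner, remote_table):
    """Render the CREATE EXISTING TABLE statement that maps a local
    proxy table onto a table on the linked server."""
    return (f"CREATE EXISTING TABLE {table} "
            f"AT '{server}..{remote_owner}.{remote_table}'")

print(link_server_sql("ora_srv", "ORAODBC", "OracleDSN"))
print(proxy_table_sql("customers_px", "ora_srv", "SCOTT", "CUSTOMERS"))
```

Once the proxy table exists, it can be queried like any local table, with the work pushed to the remote Oracle server.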

New features in Oracle Planning and Budgeting Cloud Service (PBCS)

Oracle will add new features in the upcoming February 17th, 2017 update of PBCS which include:

  • Human Capital Management Integration with Enterprise Planning and Budgeting Cloud Workforce Planning
  • Updated Options for Business Rules Properties
  • Ability to Use Planning Formula Expressions in Calculation Manager
  • Automating the Downloading of Activity Reports
  • Single Sign-On with Oracle Fusion Applications
  • Define Valid Intersections-Based on Attribute
  • Showing Approval Units as Aliases in Approvals Notifications
  • New EPM Automate Utility Version that Imports IDM Artifacts
  • New Smart View Version with Support for Alternate Hierarchy
  • Indicator in Grid Cells for Related Content

The latest update also fixed some defects and issues from the previous PBCS version, including:

  • The HTML view of a Financial Reporting book now correctly loads Financial Reporting reports.
  • The threshold limit of the Year dimension has been increased to help customers extend planning model portfolios.
  • You are now able to run business rules even if a member name contains a quotation mark.
  • When you launch a rule in Planning, the rule no longer automatically uses the parameters from the last run unless you select the Launch Last Value option in the dialog box for specifying runtime parameters.
  • The rundatarule command of the EPM Automate utility now executes correctly without errors.

The full document detailing new features and fixed defects can be viewed here.

SAP launched the new version of its cloud-based ERP product suite, S/4HANA

SAP unveiled the newest advances to its cloud ERP suite, SAP S/4HANA, featuring new in-memory technology, machine learning, contextual analytics, digital assistant and Fiori (applications). The launch comes as SAP tries to compete with Oracle and Salesforce who recently added AI and machine learning to their applications.
The product suite offers three versions:
• a Professional Services Cloud focused on project management,
• a Finance Cloud including procurement and order management capabilities, and
• an Enterprise Management Cloud for “comprehensive real-time business management”.

The offering is hosted in SAP data centers and in the future the company plans to support other public cloud providers like Microsoft Azure, Amazon Web Services, and others.
SAP plans to focus more on machine learning, AI and plans to include blockchain and IoT capabilities in the near future.

Read more:
SAP Unveils Next-Generation, Intelligent ERP with SAP S/4HANA Cloud
SAP S/4HANA: 10 Questions Answered

Tableau leads Gartner's Magic Quadrant for the fifth consecutive year

Gartner recently released its 2017 BI Magic Quadrant, which reinforces Gartner's view that BI is now focused on visual data discovery, as evidenced by Tableau's, Microsoft's and Qlik's continued leadership in the ranking:

[Image: 2017 Gartner BI Magic Quadrant]

Looking at last year’s ranking, we can see how Microsoft and Tableau have broken from the pack, and how Microsoft is catching up with Tableau. On the other hand, Qlik stepped back as the company transitioned QlikView to the new QlikSense platform.
Considering that Gartner stated that “by 2020, natural-language generation and artificial intelligence will be a standard feature of 90% of modern BI platforms”, Microsoft will continue to further its lead given its investment in both areas (within PowerBI and outside of it).
“What is new this year, is that traditional BI vendors that were slow to adjust to the “modern wave of disruption” (such as IBM, SAP, Oracle and MicroStrategy) and struggled to remain relevant during the market transition, have finally matured their modern offerings enough to appeal to many in their installed bases already using these”, Gartner stated.

[Image: Gartner Magic Quadrant, year-over-year comparison]

Read more:
https://www.tableau.com/about/blog/2017/2/tableau-five-years-leader-gartners-magic-quadrant-analytics-66133

Interested in learning how BI tools can help your firm? Contact us today or read more about our expertise in Tableau and PowerBI.

Cloud

Is speed everything?

Cloud hosting is often deemed the panacea for all infrastructure problems: no CAPEX, a simple cost structure, no obsolescence, faster load speeds, flexibility and scalability. However, recent problems with some cloud hosting providers (see Amazon AWS S3 outage is breaking things for a lot of websites and apps) forced some cloud users to define (or redefine) a strategy for their cloud (and on-premise) infrastructure. Regardless of the advantages of cloud infrastructure, a cookie-cutter approach to defining a cloud infrastructure won't be optimal for most firms and could expose them to potential problems, including outages, unforeseen costs, lag due to incompatible systems, lack of redundancy, unnecessary complexity, and problems arising from shared workloads.

Speed

Faster load times and increased processing power lead to bigger profits. Faster-loading websites will lead to more people visiting and staying at a website (with as many as 40% of users leaving websites that fail to load within 3 seconds), while increased processing power will let you analyze data faster and process more forms and requests quicker. Adding more processors, memory and storage can help firms improve the performance of their IT processes. However, raw performance won't be the only thing affecting your processes, so several other factors should be considered when implementing a cloud strategy. Before migrating part or all of its infrastructure to the cloud, a firm needs to understand its business processes and which IT services are affected. Different strategies should be used depending on whether the firm seeks to move its entire infrastructure to the cloud, use it for computing power, host applications or data, or simply use an online software service.

Security

A cloud provider should follow all the security procedures you would implement on your on-site servers, including installing firewalls and antivirus software, using multifactor user authentication, conducting background checks on its employees, and protecting the data center (from natural disasters, fires and power outages). The security of network-to-network connections should also be analyzed to ensure data security. Redundant systems and facilities should be implemented for mission-critical processes and to protect your firm in case of data loss. Some providers will compensate clients for data losses, but to ensure business continuity, redundancies should be in place to mitigate data-loss risk. The support services offered by providers should also be analyzed, not only because they provide assistance during outages, but also because compatibility issues may arise with certain programs.

All relevant IT and business factors should be considered when choosing a cloud provider and defining an IT strategy. Speed will get you business, but security and robustness will keep you in business. Inttao offers full IT and cloud consulting services. You can check out the services we offer here. Or contact us today to see how a well-defined and implemented cloud strategy can reduce your costs and help grow your business.

Banks

How analytics are helping manage risks, automate GRC and drive value creation

Banks and other financial institutions are exposed to a myriad of problems. From adhering to regulations and controlling risks to staying ahead of competitors and offering new products and services, they need tools that alert them to impending problems, monitor their assets in real time, and provide the right decision-making support to stay ahead of the market. In the following presentation you can learn how we developed a unified toolset that allows these institutions to unlock hidden value.