#AmazonDynamodb
Posts tagged #AmazonDynamodb on Bluesky
AWS Glue zero-ETL integrations with Amazon DynamoDB as the source support new configurations

AWS Glue zero-ETL now supports configurable change data capture (CDC) refresh intervals and on-demand data ingestion for integrations with Amazon DynamoDB as the source. This enhancement can help you customize how frequently data changes are captured from your Amazon DynamoDB tables, with refresh intervals ranging from 15 minutes to 6 days, and trigger immediate data ingestion when needed. These capabilities bring zero-ETL integrations from Amazon DynamoDB sources to feature parity with zero-ETL integrations from SaaS sources, like Salesforce, SAP, and ServiceNow, ensuring consistent functionality across different source types.

With configurable CDC refresh intervals, you can optimize data pipeline performance by adjusting the frequency of change capture to match your business requirements, whether you need near real-time updates every 15 minutes or can work with longer intervals of up to 6 days to reduce costs. The on-demand ingestion capability lets you immediately capture critical data changes without waiting for the next scheduled CDC interval. This is ideal for scenarios that require data to be immediately available for analytics, reporting, or downstream applications, and it helps strike a balance between data freshness requirements and operational efficiency.

These features are available today in all AWS Regions where AWS Glue zero-ETL is supported. To get started with configuring CDC refresh intervals and on-demand ingestion for your Amazon DynamoDB integrations, see the AWS Glue User Guide. To learn more about AWS Glue zero-ETL integrations, visit the AWS Glue documentation.
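The supported window works out to 15 minutes at the low end and 6 days (8,640 minutes) at the high end. A small, purely illustrative helper (not part of any AWS SDK) can sanity-check an interval before you configure the integration:

```python
# Illustrative only: validate a zero-ETL CDC refresh interval against the
# documented 15-minute-to-6-day window before configuring the integration.
MIN_INTERVAL_MINUTES = 15            # near real-time lower bound
MAX_INTERVAL_MINUTES = 6 * 24 * 60   # 6 days = 8640 minutes

def validate_cdc_refresh_interval(minutes: int) -> int:
    """Return the interval unchanged if it is in the supported range, else raise."""
    if not MIN_INTERVAL_MINUTES <= minutes <= MAX_INTERVAL_MINUTES:
        raise ValueError(
            f"CDC refresh interval must be {MIN_INTERVAL_MINUTES}-"
            f"{MAX_INTERVAL_MINUTES} minutes, got {minutes}"
        )
    return minutes

print(validate_cdc_refresh_interval(60))  # hourly refresh -> 60
```

The actual parameter names for setting the interval are defined by the AWS Glue API; see the AWS Glue User Guide for the real configuration steps.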

🆕 AWS Glue zero-ETL now supports CDC refresh intervals and on-demand data ingestion for Amazon DynamoDB, offering customizable data capture from 15 min to 6 days, ensuring feature parity with SaaS sources and optimizing pipeline performance. Available globally.

#AWS #AwsGlue #AmazonDynamodb

AWS expands Resource Control Policies support to Amazon DynamoDB

AWS Resource Control Policies (RCPs) now support Amazon DynamoDB (https://aws.amazon.com/dynamodb/). RCPs are a type of organization policy that you can use to manage permissions in your organization; they offer central control over the maximum available permissions for resources in your organization. With this expansion, you can now use RCPs to manage permissions for Amazon DynamoDB. For example, you can create policies that prevent identities outside your organization from accessing DynamoDB, helping you build a data perimeter (https://aws.amazon.com/identity/data-perimeters-on-aws/) and enforce baseline security standards across your AWS environment.

RCPs are available in all AWS commercial Regions and AWS GovCloud (US) Regions. To learn more about RCPs and view the full list of supported AWS services, visit the resource control policies section (https://docs.aws.amazon.com/organizations/latest/userguide/orgs_manage_policies_rcps.html) in the AWS Organizations User Guide.
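As a rough sketch of the example in the announcement, an RCP that denies DynamoDB access to principals outside your organization might look like the policy below. The organization ID is a placeholder, and production policies typically add further conditions (for example, exempting AWS service principals); treat this as a starting point, not a drop-in policy:

```python
import json

# Hedged sketch of a Resource Control Policy (RCP): deny DynamoDB actions
# unless the calling principal belongs to the organization. "o-exampleorgid"
# is a placeholder organization ID.
rcp = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "EnforceOrgIdentitiesForDynamoDB",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "dynamodb:*",
            "Resource": "*",
            "Condition": {
                "StringNotEqualsIfExists": {
                    "aws:PrincipalOrgID": "o-exampleorgid"
                }
            },
        }
    ],
}
print(json.dumps(rcp, indent=2))
```

Validate any real policy against the RCP syntax reference in the AWS Organizations User Guide before attaching it.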

#AWS #AmazonDynamodb

Amazon DynamoDB global tables now support replication across multiple AWS accounts

Amazon DynamoDB global tables now support replication across multiple AWS accounts. DynamoDB global tables is a fully managed, serverless, multi-Region, and multi-active database used by tens of thousands of customers to power business-critical applications. With this new capability, you can replicate tables across AWS accounts and Regions to improve resiliency, isolate workloads at the account level, and apply distinct security and governance controls.

For multi-account global tables, DynamoDB automatically replicates tables across AWS accounts and Regions. This strengthens fault tolerance and helps ensure applications remain highly available even during account-level disruptions, while allowing you to align data placement with organizational and security requirements. Multi-account global tables are ideal for customers that adopt multi-account strategies or use AWS Organizations to improve security isolation, enforce data perimeter guardrails, implement disaster recovery (DR), or separate workloads by business unit.

Multi-account global tables are available in all AWS Regions (https://aws.amazon.com/about-aws/global-infrastructure/regional-product-services/) and are billed according to existing DynamoDB pricing (https://aws.amazon.com/dynamodb/pricing/). To get started, see the global tables documentation (https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/GlobalTables.html), and visit the multi-account global tables guide (https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/globaltables-MultiAccount.html) to learn more about the benefits of using a multi-account strategy for your AWS environment.
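With multi-account replication, each replica is identified by an account and a Region rather than a Region alone. The sketch below is purely illustrative (plain Python, not a DynamoDB API call) and uses placeholder account IDs; it only shows the kind of topology check you might do when planning replicas:

```python
# Purely illustrative: model a multi-account global table topology and
# verify each (account, Region) pair appears at most once. Account IDs
# and Regions are placeholders.
replicas = [
    {"account": "111111111111", "region": "us-east-1"},       # workload account
    {"account": "222222222222", "region": "eu-west-1"},       # DR account
    {"account": "222222222222", "region": "ap-northeast-1"},  # second DR Region
]

def unique_replicas(reps):
    """Return True if no (account, Region) pair is duplicated."""
    pairs = [(r["account"], r["region"]) for r in reps]
    return len(pairs) == len(set(pairs))

print(unique_replicas(replicas))  # True
```

The real replica configuration is managed through DynamoDB itself; see the multi-account global tables guide for the supported setup flow.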

#AWS #AmazonDynamodb

🆕 Amazon DynamoDB global tables now support replication across multiple AWS accounts, enhancing resiliency and security isolation, and is available in all regions. This feature is ideal for multi-account strategies and disaster recovery.

#AWS #AmazonDynamodb

Amazon DynamoDB global tables with multi-Region strong consistency now supports application resiliency testing with AWS Fault Injection Service

Amazon DynamoDB (https://aws.amazon.com/dynamodb/) global tables with multi-Region strong consistency (MRSC) now support application resiliency testing with AWS Fault Injection Service (FIS, https://aws.amazon.com/fis/), a fully managed service for running controlled fault injection experiments to improve application performance, observability, and resilience. With this launch, you can create real-world failure scenarios for MRSC global tables, such as regional failures, enabling you to observe how your applications respond to these disruptions and validate your resilience mechanisms.

MRSC global tables replicate your DynamoDB tables automatically across your choice of AWS Regions to achieve fast, strongly consistent read and write performance, providing you 99.999% availability (https://aws.amazon.com/dynamodb/sla/), increased application resiliency, and improved business continuity. You can use the new FIS action to observe how your application responds to a pause in regional replication, then tune your monitoring and recovery processes to improve resiliency and application availability.

MRSC global tables support for FIS is available in the following AWS Regions: US East (N. Virginia), US East (Ohio), US West (Oregon), Asia Pacific (Tokyo), Asia Pacific (Osaka), Asia Pacific (Seoul), Europe (Ireland), Europe (London), Europe (Frankfurt), and Europe (Paris). To get started, visit the DynamoDB actions reference in the FIS User Guide (https://docs.aws.amazon.com/fis/latest/userguide/fis-actions-reference.html#dynamodb-actions-reference).
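The announcement does not show a template, so the shape below is only a rough illustration of an FIS experiment template for pausing replication. The action ID, resource type, parameter keys, target key, role ARN, and table ARN are all assumptions to verify against the FIS actions reference before use:

```python
# Hedged sketch of an FIS experiment template that pauses replication for
# an MRSC global table. Every identifier below (action ID, resource type,
# parameter names, ARNs) is an assumption or placeholder; confirm each
# against the DynamoDB section of the FIS actions reference.
experiment_template = {
    "description": "Pause DynamoDB global table replication in one Region",
    "roleArn": "arn:aws:iam::111111111111:role/fis-experiment-role",  # placeholder
    "targets": {
        "myGlobalTable": {
            "resourceType": "aws:dynamodb:global-table",  # assumed resource type
            "resourceArns": [
                "arn:aws:dynamodb:us-east-1:111111111111:table/Orders"  # placeholder
            ],
            "selectionMode": "ALL",
        }
    },
    "actions": {
        "pauseReplication": {
            "actionId": "aws:dynamodb:global-table-pause-replication",  # assumed ID
            "parameters": {"duration": "PT5M"},  # assumed: 5-minute pause
            "targets": {"GlobalTables": "myGlobalTable"},  # assumed target key
        }
    },
    "stopConditions": [{"source": "none"}],
}
```

A template like this would be passed to FIS's CreateExperimentTemplate API; the point of the experiment is to watch your application's behavior and alarms while replication is paused.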

#AWS #AmazonDynamodb

🆕 Amazon DynamoDB MRSC now supports AWS FIS for testing resiliency during regional failures, enabling you to observe application responses and improve performance, observability, and resilience. Available in multiple regions. Visit DynamoDB FIS documentation to start.

#AWS #AmazonDynamodb

AWS Databases are now available on v0 by Vercel

Amazon Aurora PostgreSQL, Amazon Aurora DSQL, and Amazon DynamoDB serverless databases are now available on v0 by Vercel, an AI-powered tool that transforms your ideas into production-ready, full-stack web applications in minutes. With this launch, you can build your ideas as well as create and connect to AWS databases from v0 using natural language prompts.

To get started, simply describe what you want to build in v0. The tool takes care of developing the frontend user interface and backend logic, storing your application data in the AWS database that best meets your application's needs. v0 provides an end-to-end setup experience where you can choose and configure database resources under a new AWS account or link to an existing account, all without leaving the v0 interface. New AWS accounts from Vercel include access to all three databases and $100 USD in credits that can be used with any of the database options for up to six months. You can also manage your plan, add payment information, and view usage details anytime by visiting the AWS settings portal from the Vercel dashboard. To learn more, visit http://v0.app or the AWS landing page on the Vercel Marketplace (https://vercel.com/marketplace/aws).

The serverless options for Amazon Aurora PostgreSQL (https://aws.amazon.com/rds/aurora/), Amazon Aurora DSQL (https://aws.amazon.com/rds/aurora/dsql/), and Amazon DynamoDB (https://aws.amazon.com/dynamodb/) do not require infrastructure management and reduce costs by scaling down to zero automatically when not in use. You can create a database in the following AWS Regions: US East (N. Virginia), US East (Ohio), US West (Oregon), Europe (Ireland), Europe (Frankfurt), Asia Pacific (Tokyo), and Asia Pacific (Mumbai). AWS Databases deliver security, reliability, and price performance without the operational overhead, whether you're prototyping your next big idea or running production AI and data-driven applications. For more information, visit the AWS Databases webpage (https://aws.amazon.com/products/databases/).

#AWS #AmazonDynamodb #AmazonAurora

🆕 AWS Databases like Aurora PostgreSQL and DynamoDB are now on Vercel's v0, an AI tool for fast full-stack app dev. It connects to AWS, scales serverlessly, and offers $100 in credits for new accounts. Manage settings via the Vercel dashboard.

#AWS #AmazonDynamodb #AmazonAurora

AWS Databases are now available on the Vercel Marketplace

Today, AWS Databases including Amazon Aurora PostgreSQL, Amazon Aurora DSQL, and Amazon DynamoDB are generally available on the Vercel Marketplace, enabling you to create and connect to an AWS database directly from Vercel in seconds. To get started, you can create a new AWS account from Vercel that includes access to the three databases and $100 USD in credits. These credits can be used with any of these database options for up to six months. Once your account is set up, you can have a production-ready Aurora database or DynamoDB table powering your Vercel projects within seconds. You can also manage your plan, add payment information, and view usage details anytime by visiting the AWS settings portal from the Vercel dashboard. To learn more, visit the AWS landing page on the Vercel Marketplace (https://vercel.com/marketplace/aws).

The integration includes serverless options for Amazon Aurora PostgreSQL (https://aws.amazon.com/rds/aurora/), Amazon Aurora DSQL (https://aws.amazon.com/rds/aurora/dsql/), and Amazon DynamoDB (https://aws.amazon.com/dynamodb/) to simplify your application needs and reduce costs by scaling to zero when not in use. You can create a database in the following AWS Regions: US East (N. Virginia), US East (Ohio), US West (Oregon), Europe (Ireland), Europe (Frankfurt), Asia Pacific (Tokyo), and Asia Pacific (Mumbai), with more Regions coming soon. AWS Databases deliver security, reliability, and price performance without the operational overhead, whether you're prototyping your next big idea or running production AI and data-driven applications. For more information, visit the AWS Databases webpage (https://aws.amazon.com/products/databases/).

#AWS #AmazonDynamodb #AmazonAurora #AwsDatabaseMigrationService

🆕 AWS Databases like Aurora and DynamoDB are now on Vercel Marketplace for quick setup and a $100 six-month credit. Manage and scale serverless databases across regions with no overhead. Visit AWS on Vercel for details.

#AWS #AmazonDynamodb #AmazonAurora #AwsDatabaseMigrationService

Amazon DynamoDB now supports multi-attribute composite keys in global secondary indexes

Amazon DynamoDB now supports primary keys composed of up to eight attributes in global secondary indexes (GSIs). Previously, partition and sort keys were limited to one attribute each; DynamoDB now supports up to four attributes each for the partition and sort keys. With multi-attribute keys, you no longer need to manually concatenate values into synthetic keys, which sometimes resulted in the need to backfill data before adding new indexes. Instead, you can create primary keys using up to eight existing attributes, making it easier to model diverse access patterns and adapt to new query requirements.

Multi-attribute partition keys improve data distribution and uniqueness. Multi-attribute sort keys enable flexible querying by letting you specify conditions on sort key attributes from left to right. For example, an index with partition key UserId and sort key attributes Country, State, and City lets you query all locations for a user, then narrow results by Country, State, or City.

Multi-attribute partition and sort keys are available at no additional charge in all AWS Regions where DynamoDB is available. You can create them using the AWS Management Console, AWS CLI, AWS SDKs, or DynamoDB API. To learn more, see Global Secondary Indexes in the Amazon DynamoDB Developer Guide.
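The left-to-right narrowing in the UserId/Country/State/City example can be illustrated with plain Python (no DynamoDB API involved; the items and the left-prefix rule are modeled locally):

```python
# Illustrative only: mimic left-to-right narrowing on a multi-attribute
# sort key (Country, State, City) within one partition (UserId).
items = [
    {"UserId": "u1", "Country": "US", "State": "WA", "City": "Seattle"},
    {"UserId": "u1", "Country": "US", "State": "WA", "City": "Tacoma"},
    {"UserId": "u1", "Country": "US", "State": "OR", "City": "Portland"},
    {"UserId": "u1", "Country": "CA", "State": "BC", "City": "Vancouver"},
]

def query(user_id, **prefix):
    """Match sort-key attributes left to right: Country, then State, then City."""
    order = ["Country", "State", "City"]
    # A condition on a later attribute requires conditions on all earlier ones.
    assert list(prefix) == order[: len(prefix)], "conditions must form a left prefix"
    return [
        i for i in items
        if i["UserId"] == user_id
        and all(i[k] == v for k, v in prefix.items())
    ]

print(len(query("u1", Country="US", State="WA")))  # 2
```

Querying with only `State="WA"` raises the left-prefix assertion, mirroring why a condition on a later sort-key attribute needs conditions on the earlier ones.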

🆕 Amazon DynamoDB now supports multi-attribute composite keys in GSIs, allowing up to eight attributes for partition and sort keys, simplifying data modeling and querying without synthetic keys. Available at no extra cost in all regions.

#AWS #AwsGovcloudUs #AmazonDynamodb

Amazon DynamoDB Streams expands AWS PrivateLink support to FIPS endpoints

Amazon DynamoDB Streams now supports AWS PrivateLink for all available Amazon DynamoDB Streams Federal Information Processing Standard (FIPS) endpoints in US and Canada commercial AWS Regions. With this launch, you can establish a private connection between your virtual private cloud (VPC) and Amazon DynamoDB Streams FIPS endpoints instead of connecting over the public internet, helping you meet your organization's business, compliance, and regulatory requirements to limit public internet connectivity.

Amazon DynamoDB Streams support for AWS PrivateLink FIPS endpoints is available in the US and Canada commercial AWS Regions: US East (N. Virginia), US East (Ohio), US West (N. California), US West (Oregon), Canada (Central), and Canada West (Calgary). To learn more, visit the Amazon DynamoDB Streams documentation (https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/privatelink-streams.html). For more information about AWS PrivateLink and its benefits, visit the AWS PrivateLink product page (https://aws.amazon.com/privatelink/).
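Connecting over PrivateLink means creating an interface VPC endpoint for the Streams FIPS service. The parameter set below is only a sketch: the VPC, subnet, and security-group IDs are placeholders, and the service-name pattern is an assumption to check against the DynamoDB Streams PrivateLink documentation:

```python
# Hedged sketch: parameters for an interface VPC endpoint targeting the
# DynamoDB Streams FIPS endpoint via AWS PrivateLink. IDs are placeholders,
# and the ServiceName pattern is an assumption; confirm the exact service
# name in the DynamoDB Streams PrivateLink documentation.
region = "us-east-1"
endpoint_params = {
    "VpcEndpointType": "Interface",
    "VpcId": "vpc-0123456789abcdef0",                                 # placeholder
    "ServiceName": f"com.amazonaws.{region}.dynamodb-streams-fips",   # assumed pattern
    "SubnetIds": ["subnet-0123456789abcdef0"],                        # placeholder
    "SecurityGroupIds": ["sg-0123456789abcdef0"],                     # placeholder
    "PrivateDnsEnabled": True,
}
# These parameters would be passed to EC2's CreateVpcEndpoint API,
# e.g. with boto3: ec2.create_vpc_endpoint(**endpoint_params)
```

Once the endpoint is up, Streams traffic from the VPC resolves to private IPs instead of traversing the public internet.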

#AWS #AwsPrivatelink #AmazonDynamodb

🆕 Amazon DynamoDB Streams now supports AWS PrivateLink for FIPS endpoints in US and Canada, enabling private VPC connections to meet compliance and regulatory needs, available in select regions.

#AWS #AwsPrivatelink #AmazonDynamodb

Amazon DynamoDB Accelerator now supports AWS PrivateLink

Amazon DynamoDB Accelerator (DAX) now supports AWS PrivateLink, enabling you to securely access DAX management APIs such as CreateCluster, DescribeClusters, and DeleteCluster over private IP addresses within your virtual private cloud (VPC). DAX clusters already run inside your VPC, and all data plane operations like GetItem and Query are handled privately within the VPC. With this launch, you can now perform cluster management operations privately as well, without connecting to the public regional endpoint.

With AWS PrivateLink, you can simplify private network connectivity between VPCs, DAX, and your on-premises data centers using interface VPC endpoints and private IP addresses. It helps you meet compliance regulations and eliminates the need to use public IP addresses, configure firewall rules, or configure an internet gateway to access DAX from your on-premises data centers.

AWS PrivateLink for DAX is available in all Regions where DAX is available today. For information about DAX Regional availability, see the "Service endpoints" section in Amazon DynamoDB endpoints and quotas (https://docs.aws.amazon.com/general/latest/gr/ddb.html). There is an additional cost to use this feature; see AWS PrivateLink pricing (https://aws.amazon.com/privatelink/pricing/) for details. To get started with DAX and PrivateLink, see AWS PrivateLink for DAX (https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/dax-private-link.html).

#AWS #AmazonDynamodb

🆕 Amazon DynamoDB Accelerator (DAX) now supports AWS PrivateLink for secure, private API access within VPCs, simplifying compliance and removing public IPs. Available in all DAX regions at an additional cost. For pricing, see AWS PrivateLink and DAX docs.

#AWS #AmazonDynamodb

Amazon DynamoDB zero-ETL integration with Amazon Redshift now available in the Asia Pacific (Taipei) Region

Amazon DynamoDB zero-ETL integration with Amazon Redshift is now supported in the Asia Pacific (Taipei) Region. This expansion enables customers to run high-performance analytics on their DynamoDB data in Amazon Redshift with no impact on production workloads running on DynamoDB. Zero-ETL integrations help you derive holistic insights across many applications, break data silos in your organization, and gain significant cost savings and operational efficiencies.

Now you can run enhanced analysis on your DynamoDB data with the rich capabilities of Amazon Redshift, such as high-performance SQL, built-in ML and Spark integrations, materialized views with automatic and incremental refresh, and data sharing. Additionally, you can use history mode to easily run advanced analytics on historical data, build lookback reports, and build Type 2 Slowly Changing Dimension (SCD2) tables on your historical data from DynamoDB, out of the box in Amazon Redshift, without writing any code.

The Amazon DynamoDB zero-ETL integration with Amazon Redshift is now available in Asia Pacific (Taipei), in addition to previously supported Regions. For a complete list of supported Regions, refer to the AWS Region Table where Amazon Redshift is available. To learn more, visit the getting started guides for DynamoDB (https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/RedshiftforDynamoDB-zero-etl.html) and Amazon Redshift (https://docs.aws.amazon.com/redshift/latest/mgmt/zero-etl-using.html). For more information on using history mode, see the blog post on history mode for zero-ETL integrations (https://aws.amazon.com/blogs/big-data/amazon-redshift-announces-history-mode-for-zero-etl-integrations-to-simplify-historical-data-tracking-and-analysis/).

Amazon DynamoDB zero-ETL integration with Amazon Redshift now available in the Asia Pacific (Taipei) region

Amazon DynamoDB zero-ETL integration with Amazon Redshift is now supported in the Asia Pacific (Taipei) region. This expansion enables customers to ...

#AWS #AmazonDynamodb #AmazonRedshift

1 0 1 0
Preview
Amazon DynamoDB zero-ETL integration with Amazon Redshift now available in the Asia Pacific (Taipei) region Amazon DynamoDB zero-ETL integration with Amazon Redshift is now supported in the Asia Pacific (Taipei) region. This expansion enables customers to run high-performance analytics on their DynamoDB data in Amazon Redshift with no impact on production workloads running on DynamoDB.  Zero-ETL integrations help you derive holistic insights across many applications, break data silos in your organization, and gain significant cost savings and operational efficiencies. Now you can run enhanced analysis on your DynamoDB data with the rich capabilities of Amazon Redshift, such as high performance SQL, built-in ML and Spark integrations, materialized views with automatic and incremental refresh, and data sharing. Additionally, you can use history mode to easily run advanced analytics on historical data, build lookback reports, and build Type 2 Slowly Changing Dimension (SCD 2) tables on your historical data from DynamoDB, out-of-the-box in Amazon Redshift, without writing any code. The Amazon DynamoDB zero-ETL integration with Amazon Redshift is now available in Asia Pacific (Taipei), in addition to previously supported regions. For a complete list of supported regions, please refer to the AWS Region Table where Amazon Redshift is available. To learn more, visit the getting started guides for DynamoDB and Amazon Redshift. For more information on using history mode, we encourage you to visit our recent blog post here.

🆕 Amazon DynamoDB zero-ETL integration with Amazon Redshift is now available in Asia Pacific (Taipei), enabling high-performance analytics on DynamoDB data without impacting production workloads.

#AWS #AmazonDynamodb #AmazonRedshift

3 1 1 0
Amazon DynamoDB now supports Internet Protocol version 6 (IPv6) Amazon DynamoDB (https://aws.amazon.com/dynamodb/) now offers customers the option to use Internet Protocol version 6 (IPv6) addresses in their Amazon Virtual Private Cloud (VPC) when connecting to DynamoDB tables, streams, and DynamoDB Accelerator (DAX), including with AWS PrivateLink Gateway and Interface endpoints. Customers moving to IPv6 can simplify their network stack and meet compliance requirements by using a network that supports both IPv4 and IPv6. The continued growth of the internet is exhausting available Internet Protocol version 4 (IPv4) addresses. IPv6 increases the number of available addresses by several orders of magnitude and customers no longer need to manage overlapping address spaces in their VPCs. Customers can standardize their applications on the new version of Internet Protocol by moving to IPv6 with a few clicks in the AWS Management Console. Support for IPv6 in Amazon DynamoDB is now available in all commercial AWS Regions in the United States and the AWS GovCloud (US) Regions. It will deploy to the remaining global AWS Regions where Amazon DynamoDB is available over the next few weeks. To connect to DynamoDB using IPv6 addresses and check regional availability, please see the DynamoDB developer guide (https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/AccessingDynamoDB.html#dual-stackipv4-6) and the DAX user guide (https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/DAX.create-cluster.DAX_and_IPV6.html).

Amazon DynamoDB now supports Internet Protocol version 6 (IPv6)

Amazon DynamoDB (https://aws.amazon.com/dynamodb/) now offers customers the option to use Internet Protocol version 6 (IPv6) addresses in their Amazon Virtual Private Cloud (VPC) when connecting to DynamoDB tables, streams, an...

#AWS #AmazonDynamodb

1 0 0 0
Preview
Amazon DynamoDB now supports Internet Protocol version 6 (IPv6) Amazon DynamoDB now offers customers the option to use Internet Protocol version 6 (IPv6) addresses in their Amazon Virtual Private Cloud (VPC) when connecting to DynamoDB tables, streams, and DynamoDB Accelerator (DAX), including with AWS PrivateLink Gateway and Interface endpoints. Customers moving to IPv6 can simplify their network stack and meet compliance requirements by using a network that supports both IPv4 and IPv6. The continued growth of the internet is exhausting available Internet Protocol version 4 (IPv4) addresses. IPv6 increases the number of available addresses by several orders of magnitude and customers no longer need to manage overlapping address spaces in their VPCs. Customers can standardize their applications on the new version of Internet Protocol by moving to IPv6 with a few clicks in the AWS Management Console. Support for IPv6 in Amazon DynamoDB is now available in all commercial AWS Regions in the United States and the AWS GovCloud (US) Regions. It will deploy to the remaining global AWS Regions where Amazon DynamoDB is available over the next few weeks. To connect to DynamoDB using IPv6 addresses and check regional availability, please see the DynamoDB developer guide and the DynamoDB Accelerator user guide.

🆕 Amazon DynamoDB now supports IPv6, letting you connect from your VPC over IPv6 and simplify your network stack. Available now in all commercial US Regions and the AWS GovCloud (US) Regions, with rollout to remaining Regions over the coming weeks. For more, see the DynamoDB developer guide.

#AWS #AmazonDynamodb

2 0 0 0
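Connecting over IPv6 goes through DynamoDB's dual-stack endpoints. A minimal sketch, assuming the `dynamodb.<region>.api.aws` dual-stack endpoint naming described in the developer guide and botocore's `use_dualstack_endpoint` config flag (the region name is illustrative):

```python
def dynamodb_dualstack_endpoint(region: str) -> str:
    """Return the dual-stack (IPv4 + IPv6) endpoint URL for DynamoDB
    in a given region. Dual-stack endpoints use the api.aws domain
    rather than the IPv4-only amazonaws.com domain."""
    return f"https://dynamodb.{region}.api.aws"


# With boto3 (not needed for the helper above), the SDK can resolve the
# dual-stack endpoint for you via the use_dualstack_endpoint flag:
#
#   from botocore.config import Config
#   import boto3
#   client = boto3.client(
#       "dynamodb",
#       region_name="us-east-1",
#       config=Config(use_dualstack_endpoint=True),
#   )

print(dynamodb_dualstack_endpoint("us-east-1"))
```

Traffic still resolves over IPv4 where only IPv4 is available, since dual-stack endpoints answer on both protocols.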
AWS Weekly Roundup: Amazon EC2, Amazon Q Developer, IPv6 updates, and more (September 1, 2025) My LinkedIn feed was absolutely packed this week with pictures from the AWS Heroes Summit event in Seattle. It was heartwarming to see so many familiar faces and new Heroes coming together. For those not familiar with the AWS Heroes program, it’s a global community recognition initiative that honors individuals who make outstanding contributions to […]

AWS Weekly Roundup: Amazon EC2, Amazon Q Developer, IPv6 updates, and more (September 1, 2025)

My LinkedIn feed was absolutely packed this week with pictures from ...

#AWS #AmazonApiGateway #AmazonDynamodb #AmazonEc2 #AmazonEc2MacInstances #AwsClientVpn #AwsTrainingAndCertification #WeekInReview

1 0 0 0
AWS Weekly Roundup: OpenAI models, Automated Reasoning checks, Amazon EVS, and more (August 11, 2025) AWS Summits in the northern hemisphere have mostly concluded but the fun and learning hasn’t yet stopped for those of us in other parts of the globe. The community, customers, partners, and colleagues enjoyed a day of learning and networking last week at the AWS Summit Mexico City and the AWS Summit Jakarta. Last week’s […]

AWS Weekly Roundup: OpenAI models, Automated Reasoning checks, Amazon EVS, and more (August 11, 2025)

AWS Summits in the northern hemisphere have mo...

#AWS #AmazonBedrock #AmazonDynamodb #AmazonElasticVmwareService(AmazonEvs) #AmazonSimpleQueueService(Sqs) #AwsLambda #Launch #News #WeekInReview

1 0 0 0
Amazon DynamoDB now supports more granular throttle error exceptions DynamoDB now supports more granular throttling exceptions along with their corresponding Amazon CloudWatch metrics. The additional fields in the new throttling exceptions identify the specific resources and reasons for throttling events, making it easier to understand and diagnose throttling-related issues. You can see the new Amazon CloudWatch metrics immediately, and upon upgrading your SDK to the newest version, you will also see the new granular throttling exceptions. Every throttling exception now contains a list of reasons why the request was throttled, as well as the Amazon Resource Name (ARN) of the table or index that was throttled. These new throttle exception reasons help you understand why you were throttled and enable you to take corrective actions like adjusting your configured throughput, switching your table to on-demand capacity mode, or optimizing data access patterns. The more granular throttling exceptions and their respective metrics are available in all commercial AWS Regions, the AWS GovCloud (US) Regions, and the China Regions. To get started, see the following resources: Diagnosing throttling issues (https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/throttling-diagnosing-workflow.html) in the DynamoDB developer guide, the Enhanced throttling observability in Amazon DynamoDB blog post (https://aws.amazon.com/blogs/database/enhanced-throttling-observability-in-amazon-dynamodb/), and Troubleshooting throttling (https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/TroubleshootingThrottling.html) in the DynamoDB developer guide.

Amazon DynamoDB now supports more granular throttle error exceptions

DynamoDB now supports more granular throttling exceptions along with their corresponding Amazon CloudWatch metrics. The additional fields in the new throttling exceptions identify the spec...

#AWS #AwsGovcloudUs #AmazonDynamodb

0 0 0 0
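The announcement above says each throttling exception now carries a list of reasons plus the ARN of the throttled table or index. A hedged sketch of surfacing those fields, assuming a `ThrottlingReasons` list with `reason` and `resource` keys (the exact exception shape depends on your SDK version, so verify against your SDK's documentation):

```python
def summarize_throttle(error_response: dict) -> list:
    """Turn the assumed ThrottlingReasons list from a DynamoDB throttling
    exception into human-readable 'reason: resource ARN' strings."""
    reasons = error_response.get("ThrottlingReasons", [])
    return [f"{r.get('reason')}: {r.get('resource')}" for r in reasons]


# With boto3 this would typically be applied inside an exception handler:
#
#   from botocore.exceptions import ClientError
#   try:
#       client.put_item(TableName="Orders", Item=item)
#   except ClientError as err:
#       for line in summarize_throttle(err.response):
#           print(line)

# Demo with a synthetic response dict (field names are assumptions):
sample = {
    "ThrottlingReasons": [
        {
            "reason": "ProvisionedThroughputExceeded",
            "resource": "arn:aws:dynamodb:us-east-1:123456789012:table/Orders",
        }
    ]
}
print(summarize_throttle(sample))
```

Logging these strings next to the request ID makes it straightforward to correlate a throttled call with the specific table or GSI that needs attention.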
Amazon DynamoDB now supports a CloudWatch Contributor Insights mode exclusively for throttled keys DynamoDB now supports the ability to selectively emit events for throttled keys to CloudWatch Contributor Insights, enabling you to monitor throttled keys without emitting events for all accessed keys. By emitting events for throttled keys exclusively, you no longer need to pay for all of your successful request events. CloudWatch Contributor Insights for DynamoDB can help you understand your traffic patterns by providing information about your most accessed and throttled keys in a table or global secondary index. This information can be used to understand your application usage patterns or diagnose throttling-related issues. By choosing to only emit events for throttled keys, you can reduce the amount you spend to receive these insights. The new mode to exclusively emit throttled key events to CloudWatch Contributor Insights is available in all commercial AWS Regions, the AWS GovCloud (US) Regions, and the China Regions. To get started, see the following resources: CloudWatch Contributor Insights for DynamoDB (https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/contributorinsights_HowItWorks.html) in the DynamoDB developer guide, the Enhanced throttling observability in Amazon DynamoDB blog post (https://aws.amazon.com/blogs/database/enhanced-throttling-observability-in-amazon-dynamodb/), and Troubleshooting throttling (https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/TroubleshootingThrottling.html) in the DynamoDB developer guide.

Amazon DynamoDB now supports a CloudWatch Contributor Insights mode exclusively for throttled keys

DynamoDB now supports the ability to selectively emit events for throttled keys to CloudWatch Contributor Insights, enabling you to monitor throttled keys wit...

#AWS #AmazonDynamodb #AwsGovcloudUs

0 0 0 0
Preview
Amazon DynamoDB now supports a CloudWatch Contributor Insights mode exclusively for throttled keys DynamoDB now supports the ability to selectively emit events for throttled keys to CloudWatch Contributor Insights, enabling you to monitor throttled keys without emitting events for all accessed keys. By emitting events for throttled keys exclusively, you no longer need to pay for all of your successful request events. CloudWatch Contributor Insights for DynamoDB can help you understand your traffic patterns by providing information about your most accessed and throttled keys in a table or global secondary index. This information can be used to understand your application usage patterns or diagnose throttling-related issues. By choosing to only emit events for throttled keys, you can reduce the amount you spend to receive these insights. The new mode to exclusively emit throttled key events to CloudWatch Contributor Insights is available in all commercial AWS Regions, the AWS GovCloud (US) Regions, and the China Regions. To get started, see the following list of resources: CloudWatch Contributor Insights for DynamoDB in the DynamoDB developer guide Enhanced throttling observability in Amazon DynamoDB blog post Troubleshooting throttling in the DynamoDB developer guide

🆕 Amazon DynamoDB now offers CloudWatch Contributor Insights for throttled keys, cutting costs by skipping successful request charges. This mode aids in monitoring throttling and traffic patterns, available in all commercial AWS regions. For details, see the Dy…

#AWS #AmazonDynamodb #AwsGovcloudUs

0 0 0 0
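A sketch of enabling the throttled-keys-only mode through the existing `UpdateContributorInsights` API. The `ContributorInsightsAction` parameter is part of that API today; the `ContributorInsightsMode` parameter name and `THROTTLED_KEYS` value are assumptions inferred from the announcement, so check them against the current API reference before relying on them:

```python
def throttled_keys_insights_request(table_name, index_name=None):
    """Build UpdateContributorInsights parameters that enable the
    throttled-keys-only mode for a table or, optionally, one of its
    global secondary indexes. Mode name/value are assumptions."""
    params = {
        "TableName": table_name,
        "ContributorInsightsAction": "ENABLE",
        "ContributorInsightsMode": "THROTTLED_KEYS",  # assumed value
    }
    if index_name:
        params["IndexName"] = index_name
    return params


# With boto3 the request would be sent like this:
#
#   import boto3
#   client = boto3.client("dynamodb")
#   client.update_contributor_insights(**throttled_keys_insights_request("Orders"))

print(throttled_keys_insights_request("Orders"))
```

Keeping the parameter construction in a helper makes it easy to enable the same mode across every table and GSI in a loop over `list_tables` output.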
Preview
Amazon DynamoDB now supports more granular throttle error exceptions DynamoDB now supports more granular throttling exceptions along with their corresponding Amazon CloudWatch metrics. The additional fields in the new throttling exceptions identify the specific resources and reasons for throttling events, making it easier to understand and diagnose throttling-related issues. You can see the new Amazon CloudWatch metrics immediately, and upon upgrading your SDK to the newest version, you will also see the new granular throttling exceptions. Every throttling exception now contains a list of reasons why the request was throttled, as well as the Amazon Resource Name (ARN) of the table or index that was throttled. These new throttle exception reasons help you understand why you were throttled and enable you to take corrective actions like adjusting your configured throughput, switching your table to on-demand capacity mode, or optimizing data access patterns. The more granular throttling exceptions and their respective metrics are available in all commercial AWS Regions, the AWS GovCloud (US) Regions, and the China Regions. To get started see the following list of resources: Diagnosing throttling issues in the DynamoDB developer guide Enhanced throttling observability in Amazon DynamoDB blog post Troubleshooting throttling in the DynamoDB developer guide

🆕 Amazon DynamoDB now offers more detailed throttling exceptions with CloudWatch metrics, identifying the specific resource and reasons for throttling to aid diagnostics and corrective action. Available in all commercial, GovCloud (US), and China Regions. See the developer guide for more.

#AWS #AwsGovcloudUs #AmazonDynamodb

0 0 0 0
Preview
Build the highest resilience apps with multi-Region strong consistency in Amazon DynamoDB global tables | Amazon Web Services Amazon DynamoDB now offers multi-Region strong consistency capability for global tables, providing the highest level of application resilience and enabling your applications to be always available.

📰🚨Build the highest resilience apps with multi-Region strong consistency in Amazon DynamoDB global tables by Donnie Prakoso

#AmazonDynamoDB #GlobalTables #MultiRegionConsistency #CloudComputing #DataResilience

1 0 0 0
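A sketch of requesting multi-Region strong consistency when adding a replica via the `UpdateTable` API. The `ReplicaUpdates` structure is part of the existing global tables API; the `MultiRegionConsistency` field and its `STRONG` value are taken from the announcement and should be verified against the current DynamoDB API reference:

```python
def strong_consistency_replica_request(table_name, new_region):
    """Build UpdateTable parameters that add a replica in new_region and
    request multi-Region strong consistency for the global table.
    The MultiRegionConsistency field/value are assumptions."""
    return {
        "TableName": table_name,
        "ReplicaUpdates": [{"Create": {"RegionName": new_region}}],
        "MultiRegionConsistency": "STRONG",  # assumed value
    }


# With boto3 the request would be sent like this:
#
#   import boto3
#   client = boto3.client("dynamodb", region_name="us-east-1")
#   client.update_table(**strong_consistency_replica_request("Orders", "us-west-2"))

print(strong_consistency_replica_request("Orders", "us-west-2"))
```

With strong consistency, reads in any replica Region return the latest committed write, at the cost of cross-Region write coordination, which is the trade-off the post highlights for highest-resilience applications.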