#usersession
Explore tagged Tumblr posts
digitaltoolsblog · 2 years ago
Text
#hotjar Learn how InVision's digital product team uses Hotjar to improve the customer experience. ... "Hotjar is a critical tool for us to understand our users..."
#VWO is the market-leading A/B testing tool that fast-growing companies use for experimentation & conversion rate optimization.
#uxtweak The only UX research platform you need. Conduct UX research where users complete specific tasks directly on your website. Try it for free!
#inspectlet Session Recording, Website Heatmaps, Javascript A/B Testing, Error Logging, Form Analytics. Replay user sessions. Eye-tracking heatmaps.
#Smartlook is the only combined quantitative + qualitative analytics platform that's actually easy for growing teams to set up and adopt. Get a 1-to-1 demo.
0 notes
hackgit · 3 years ago
Text
Get-UserSession Queries user sessions for the entire domain (Interactive/RDP etc), allowing you...
Get-UserSession Queries user sessions for the entire domain (Interactive/RDP etc), allowing you to query a user and see all their logged-on sessions, whether Active or Disconnected https://github.com/YossiSassi/Get-UserSession #pentesting #ad #redteam #hackers #tactics
GitHub - YossiSassi/Get-UserSession: Query user sessions for the entire domain (Interactive/RDP etc), allowing you to query a username and see all their logged-on sessions, whether Active or Disconnected.
0 notes
sagar-jaybhay · 5 years ago
Text
Constraints In SQL By Sagar Jaybhay
New Post has been published on https://is.gd/zFLU3k
Constraints In SQL
Constraints are rules enforced on the columns of a table in a database. They are used to limit or restrict the data that can go into a column, which ensures the reliability and accuracy of that data.
We can apply constraints at the column level or at the table level. Column-level constraints are declared as part of a single column's definition and apply only to that column, while table-level constraints are declared separately and can cover one or more columns of the table.
Commonly used Column constraints are below
Not Null
Default
Unique
Primary Key
Foreign key
Check
Index
Commonly used table-level constraints
Primary key
Foreign key
Unique
Check
If you insert any data that violates a constraint, the operation is aborted.
Constraints can be applied at the time of table creation using the CREATE TABLE statement, or added later with an ALTER TABLE statement.
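For example, here is a minimal sketch (the table, column and constraint names are made up for illustration) showing column-level constraints declared inside the column definitions and table-level constraints declared separately, all in one CREATE TABLE:
-- Hypothetical Employee table: NOT NULL, CHECK and DEFAULT at column level;
-- the composite PRIMARY KEY and the UNIQUE constraint at table level.
CREATE TABLE Employee
(
    EmployeeID   int NOT NULL,                            -- column-level NOT NULL
    DepartmentID int NOT NULL,
    Email        varchar(100),
    Age          int CHECK (Age > 0 AND Age < 200),       -- column-level CHECK
    Status       varchar(10) DEFAULT 'Active',            -- column-level DEFAULT
    CONSTRAINT pk_employee PRIMARY KEY (EmployeeID, DepartmentID),  -- table-level composite key
    CONSTRAINT uq_employee_email UNIQUE (Email)                     -- table-level UNIQUE
);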
Default Constraint
This constraint specifies a default value to use for a column when an INSERT statement does not supply one. Note that the default is applied only when the column is omitted from the insert; if you explicitly insert NULL, NULL is stored rather than the default.
To add a default constraint to a column of an existing table, use the ALTER TABLE syntax:
ALTER TABLE table_name ADD CONSTRAINT constraint_name DEFAULT default_value FOR column_name
To add a new column with a default value to an existing table:
ALTER TABLE table_name ADD column_name data_type (NULL | NOT NULL) CONSTRAINT constraint_name DEFAULT default_value
Drop Constraint
ALTER TABLE table_name DROP CONSTRAINT constraint_name
alter table Person add constraint df_value default 3 for [genederID]
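As a quick illustration (a sketch assuming the Person table used throughout this post, with the df_value default above in place; the IDs, names and e-mail addresses here are made up), omitting genederID from an INSERT stores the default 3, while explicitly supplying NULL stores NULL:
-- genederID omitted from the column list: the default value 3 is applied
insert into Person (ID, name, email) values (20, 'rr-default', 'default@example.com')
-- genederID supplied explicitly as NULL: NULL is stored and the default is NOT applied
insert into Person (ID, name, email, genederID) values (21, 'rr-null', 'null@example.com', NULL)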
Check Constraint
The check constraint is used to limit the range of values that can be entered in a specific column. In our case, the Person table already exists, so we can add a new age column with a check constraint using ALTER TABLE syntax.
alter table person add age int constraint chk_age check(age>0 and age<200)
Now if I try to insert a negative value into that column using the query below
insert into Person values(7,'rr1','[email protected]',Null,-9)
it will throw the error below:
Msg 547, Level 16, State 0, Line 50
The INSERT statement conflicted with the CHECK constraint "chk_age". The conflict occurred in database "temp", table "dbo.Person", column 'age'.
The statement has been terminated.
The only caveat is that if you pass a NULL value, it will be inserted anyway and no error is thrown.
insert into Person values(9,'rr2','[email protected]',Null,NULL)
this query works perfectly.
How does a check constraint work?
When we add a check constraint, we supply a condition in parentheses. That condition is a Boolean expression: every value passed in is evaluated against the expression, which returns a result.
If the expression returns TRUE, the check constraint allows the value; otherwise it rejects it. So what happens when we pass a NULL value? In the age example above, NULL makes the expression evaluate to UNKNOWN rather than FALSE, and for that reason the constraint allows the NULL.
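You can see this three-valued logic directly with a small ad-hoc query (a sketch, not part of the original post):
-- -9 > 0 is FALSE, so the CHECK rejects -9; NULL > 0 is UNKNOWN, so the CHECK lets NULL through
select
    case when -9 > 0 and -9 < 200 then 'TRUE' else 'FALSE or UNKNOWN' end as negative_age,
    case when NULL > 0 and NULL < 200 then 'TRUE' else 'FALSE or UNKNOWN' end as null_age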
Drop a Check Constraint
ALTER TABLE person DROP CONSTRAINT check_constraint_name
Unique Key Constraint
A unique key constraint enforces the uniqueness of a column, i.e. the column will not allow duplicate values. You can add a unique key constraint through the table designer or with a query.
Below is the syntax for adding a unique constraint using ALTER TABLE:
ALTER TABLE table_name ADD CONSTRAINT constraint_name UNIQUE(column_name);
Since both primary keys and unique keys enforce uniqueness on a column, the natural question is: when should you use which?
A table can have only one primary key, so if you need to enforce uniqueness on additional columns you use unique key constraints.
What is the difference between the Unique key and Primary Key?
A table has only one primary key, but it can have more than one unique key.
A primary key does not allow NULL values, whereas a unique key allows a single NULL value.
alter table person add constraint unique_name_key Unique([email]);
What if you enter the same value again for a unique constraint?
insert into Person values(10,'rr2','[email protected]',Null,NULL)
This insert reuses the values of the record already present in the table at row 9, changing only the primary key value. When I ran the query I got the following result:
Msg 2627, Level 14, State 1, Line 87
Violation of UNIQUE KEY constraint 'unique_name_key'. Cannot insert duplicate key in object 'dbo.Person'. The duplicate key value is ([email protected]).
The statement has been terminated.
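This also illustrates the earlier point that a SQL Server unique key admits only one NULL. A hypothetical sketch (while unique_name_key is still in place; the IDs and names are made up):
-- the first NULL email is accepted
insert into Person (ID, name, email) values (11, 'rr3', NULL)
-- a second NULL email is rejected with the same Msg 2627 violation,
-- because SQL Server treats two NULLs as duplicates for uniqueness purposes
insert into Person (ID, name, email) values (12, 'rr4', NULL)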
How to drop a unique key constraint?
Alter table table_name Drop constraint constraint_name;
alter table person drop constraint unique_name_key
Not Null Constraint
By default, a column can contain NULL values; if you want to prevent a column from accepting NULLs, use this constraint.
A NOT NULL constraint enforces the rule that the column must always contain a value.
CREATE TABLE Persons ( ID int NOT NULL, LastName varchar(255) NOT NULL, FirstName varchar(255) NOT NULL, Age int );
Adding a NOT NULL constraint to an existing column using ALTER TABLE syntax (ALTER COLUMN in SQL Server):
ALTER TABLE Persons ALTER COLUMN Age int NOT NULL;
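A quick hypothetical check against the Persons table above: supplying NULL for a NOT NULL column is rejected by SQL Server (the values here are made up):
-- fails: LastName is declared NOT NULL, so SQL Server rejects the NULL value
INSERT INTO Persons (ID, LastName, FirstName, Age) VALUES (1, NULL, 'Sagar', 30)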
Index Constraint
An index is created for faster retrieval of data. It can be created on a single column or on multiple columns. When you create an index, a row locator is stored for each row so that matching rows can be found quickly.
Indexes perform well on large databases when it comes to retrieving data, but they slow down inserts (and updates) because the index has to be maintained.
create table Person(ID int not null Primary key,name varchar(100),email varchar(100),genederID int)
CREATE INDEX index_name ON table_name ( column1, column2.....);
CREATE INDEX person_tabel_index ON person (id,name);
How to drop an Index?
ALTER TABLE table_name DROP INDEX index_name;
But you will get the following error
Msg 10785, Level 16, State 2, Line 97
The operation 'ALTER TABLE DROP INDEX' is supported only with memory-optimized tables.
Msg 1750, Level 16, State 0, Line 97
Could not create constraint or index. See previous errors.
This ALTER TABLE ... DROP INDEX form is supported only on memory-optimized tables. To use it, the table must be created as a memory-optimized table, and for that the database itself must first be set up for memory-optimized data. The commands below do this; in this example, temp is the database name.
ALTER DATABASE temp ADD FILEGROUP [TestDBSampleDB_mod_fg] CONTAINS MEMORY_OPTIMIZED_DATA;
--After that use the below command
ALTER DATABASE temp ADD FILE (NAME='temp_mod_dir', FILENAME='D:\timepass\TestDB_mod_dir')
TO FILEGROUP [TestDBSampleDB_mod_fg];
--Then you use this create table command to create a memory-optimized table.
CREATE TABLE userSession
(
    SessionId int not null,
    UserId int not null,
    CreatedDate datetime2 not null,
    ShoppingCartId int,
    index ix_UserId nonclustered hash (UserId) with (bucket_count=400000)
)
WITH (MEMORY_OPTIMIZED=ON, DURABILITY=SCHEMA_ONLY);
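Worth noting as an aside: for an ordinary disk-based table you do not need the memory-optimized route at all; the standard T-SQL way to remove the index created above is DROP INDEX with the table name:
-- standard syntax for dropping an index on a regular (disk-based) table
DROP INDEX person_tabel_index ON person;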
Primary key Constraint
A primary key is a field (or set of fields) in a table that uniquely identifies each row. Primary key values must be unique and cannot be NULL.
A table can have only one primary key, but the key can span more than one column, in which case it is called a composite key. Creating a primary key at the time of table creation:
CREATE TABLE CUSTOMERS( ID INT NOT NULL, NAME VARCHAR (20) NOT NULL, AGE INT NOT NULL, ADDRESS CHAR (25), SALARY DECIMAL (18, 2), PRIMARY KEY (ID) );
Create a primary key using the alter table
ALTER TABLE CUSTOMER ADD PRIMARY KEY (ID);
When you use the ALTER TABLE syntax, make sure the column you provide is already defined as NOT NULL.
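As mentioned above, a composite key is simply a primary key declared across more than one column. A hypothetical sketch (the table and columns are made up for illustration):
-- no single column is unique on its own, but the (OrderID, ProductID) pair identifies each row
CREATE TABLE ORDER_ITEMS(
   OrderID INT NOT NULL,
   ProductID INT NOT NULL,
   Quantity INT NOT NULL,
   PRIMARY KEY (OrderID, ProductID)
);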
Foreign Key Constraint
A foreign key constraint is used to link two tables together.
It is a column, or a combination of columns, in one table that refers to the primary key of another table.
The table containing the foreign key is called the child table, and the table containing the referenced candidate key is called the referenced or parent table.
The relationship between the two tables matches the primary key in the parent table with the foreign key in the child table.
If a table has a primary key defined on one or more fields, then no two records can have the same value in those fields.
CREATE TABLE Orders (
   OrderID int NOT NULL,
   OrderNumber int NOT NULL,
   PersonID int,
   PRIMARY KEY (OrderID),
   FOREIGN KEY (PersonID) REFERENCES Persons(PersonID)
);
--Foreign key using alter table (named here so it can be dropped below)
ALTER TABLE Orders ADD CONSTRAINT FK_PersonOrder FOREIGN KEY (PersonID) REFERENCES Persons(PersonID);
--Drop a Foreign Key constraint (in SQL Server this is done with DROP CONSTRAINT)
ALTER TABLE Orders DROP CONSTRAINT FK_PersonOrder;
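To see the foreign key in action, a hypothetical sketch (the order numbers and IDs are made up): an order whose PersonID exists in Persons is accepted, while one that references a missing PersonID is rejected with the familiar Msg 547 constraint conflict.
-- assuming Persons contains rows with PersonID 1, 2 and 3 only
INSERT INTO Orders (OrderID, OrderNumber, PersonID) VALUES (1, 1001, 2);   -- accepted: PersonID 2 exists in Persons
INSERT INTO Orders (OrderID, OrderNumber, PersonID) VALUES (2, 1002, 99);  -- rejected: no Persons row with PersonID 99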
0 notes
moderntechnolab · 6 years ago
Link
Microsoft launches its Clarity web analytics tool for A/B testing and visualizing user sessions #microsoft #webanalytics #visualizing #usersessions https://ift.tt/2Hs2CnI
0 notes
lifelongprogrammer · 8 years ago
Text
Caching Data in Spring Using Redis
Via http://ift.tt/2jM8cW4
The Scenario
We would like to cache Cassandra data in Redis for better read performance.
Cache Configuration
To make the data in Redis more readable and easier to troubleshoot and debug, we use GenericJackson2JsonRedisSerializer to serialize values as JSON and StringRedisSerializer to serialize keys. For GenericJackson2JsonRedisSerializer to work, we also configure the objectMapper to store type info: objectMapper.enableDefaultTyping(ObjectMapper.DefaultTyping.NON_FINAL);
We configured the cacheManager to store null values. We use a configuration-driven approach and have a lot of configuration values, with defaults defined in property files; the code first reads from the database and, if the result is null, falls back to the property files. This is why we want to cache null values.
We also use SpEL to set a different TTL for each cache: redis.expires={configData:XSeconds, userSession: YSeconds}
@Configuration
@EnableCaching
public class RedisCacheConfig extends CachingConfigurerSupport {

    @Value("${redis.hostname}")
    String redisHostname;
    @Value("${redis.port}")
    int redisPort;
    @Value("#{${redis.expires}}")
    private Map<String, Long> expires;

    @Bean
    public JedisConnectionFactory redisConnectionFactory() {
        final JedisConnectionFactory redisConnectionFactory = new JedisConnectionFactory();
        redisConnectionFactory.setHostName(redisHostname);
        redisConnectionFactory.setPort(redisPort);
        redisConnectionFactory.setUsePool(true);
        return redisConnectionFactory;
    }

    @Bean
    public ObjectMapper objectMapper() {
        final ObjectMapper objectMapper = Util.createFailSafeObjectmapper();
        objectMapper.enableDefaultTyping(ObjectMapper.DefaultTyping.NON_FINAL);
        return objectMapper;
    }

    @Bean("redisTemplate")
    public RedisTemplate<String, Object> genricJacksonRedisTemplate(final JedisConnectionFactory cf,
            final ObjectMapper objectMapper) {
        final RedisTemplate<String, Object> redisTemplate = new RedisTemplate<>();
        redisTemplate.setKeySerializer(new StringRedisSerializer());
        redisTemplate.setHashKeySerializer(new StringRedisSerializer());
        redisTemplate.setValueSerializer(new GenericJackson2JsonRedisSerializer(objectMapper));
        redisTemplate.setHashValueSerializer(new GenericJackson2JsonRedisSerializer(objectMapper));
        redisTemplate.setConnectionFactory(cf);
        return redisTemplate;
    }

    @Bean
    public CacheManager cacheManager(final RedisTemplate<String, Object> redisTemplate) {
        final RedisCacheManager cacheManager = new RedisCacheManager(redisTemplate, Collections.<String>emptyList(), true);
        cacheManager.setDefaultExpiration(86400);
        cacheManager.setExpires(expires);
        cacheManager.setLoadRemoteCachesOnStartup(true);
        return cacheManager;
    }
}
Cache CassandraRepository
@Repository
@CacheConfig(cacheNames = Util.CACHE_CONFIG)
public interface ConfigurationDao extends CassandraRepository<Configuration> {

    @Query("Select * from configuration where name=?0")
    @Cacheable
    Configuration findByName(String name);

    @Query("Delete from configuration where name=?0")
    @CacheEvict
    void delete(String name);

    @Override
    @CacheEvict(key = "#p0.name")
    void delete(Configuration config);

    /*
     * Check http://ift.tt/1LItlnQ about what #p0 means
     */
    @Override
    @SuppressWarnings("unchecked")
    @CachePut(key = "#p0.name")
    Configuration save(Configuration config);

    /*
     * This API doesn't work very well with cache - as spring cache doesn't support put or evict
     * multiple keys. Call save(Configuration config) in a loop instead.
     */
    @Override
    @CacheEvict(allEntries = true)
    @Deprecated
    <S extends Configuration> Iterable<S> save(Iterable<S> configs);

    /*
     * This API doesn't work very well with cache - as spring cache doesn't support put or evict
     * multiple keys. Call delete(Configuration config) in a loop instead.
     */
    @Override
    @CacheEvict(allEntries = true)
    @Deprecated
    void delete(Iterable<? extends Configuration> configs);
}
Admin API to Manage Cache
We inject the CacheManager to add or evict data from Redis. But to scan all keys in a cache (like config), I need to use stringRedisTemplate.opsForZSet() to get the cache's keys: the value stored under the cache's ~keys entry is a list of string keys, so I have to read it with StringRedisTemplate. After getting the keys, I use redisTemplate.opsForValue().multiGet to fetch their values. I will update this post if I find a better way to do this.
public class CacheResource {
    private static final String REDIS_CACHE_SUFFIX_KEYS = "~keys";

    @Autowired
    @Qualifier("redisTemplate")
    RedisTemplate<String, Object> redisTemplate;

    @Autowired
    @Qualifier("stringRedisTemplate")
    StringRedisTemplate stringRedisTemplate;

    @Autowired
    private CacheManager cacheManager;

    /**
     * If sessionId is not null, return its associated user info.<br>
     * It also returns other cached data: they are small data.
     *
     * @return
     */
    @GetMapping(produces = MediaType.APPLICATION_JSON_VALUE, path = "/cache")
    public Map<String, Object> get(@RequestParam("sessionIds") final String sessionIds,
            @RequestParam(name = "getConfig", defaultValue = "false") final boolean getConfig) {
        final Map<String, Object> resultMap = new HashMap<>();
        if (getConfig) {
            final Set<String> configKeys =
                    stringRedisTemplate.opsForZSet().range(Util.CACHE_CONFIG_DAO + REDIS_CACHE_SUFFIX_KEYS, 0, -1);
            final List<Object> objects = redisTemplate.opsForValue().multiGet(configKeys);
            resultMap.put(Util.CACHE_CONFIG + REDIS_CACHE_SUFFIX_KEYS, objects);
        }
        if (StringUtils.isNotBlank(sessionIds)) {
            final Map<String, Object> sessionIdToUsers = new HashMap<>();
            final Long totalUserCount =
                    stringRedisTemplate.opsForZSet().size(Util.CACHE_USER + REDIS_CACHE_SUFFIX_KEYS);
            sessionIdToUsers.put("totalUserCount", totalUserCount);
            final ArrayList<String> sessionIdList = Lists.newArrayList(Util.COMMA_SPLITTER.split(sessionIds));
            final List<Object> sessionIDValues = redisTemplate.opsForValue().multiGet(sessionIdList);
            for (int i = 0; i < sessionIdList.size(); i++) {
                sessionIdToUsers.put(sessionIdList.get(i), sessionIDValues.get(i));
            }
            resultMap.put(Util.CACHE_USER + REDIS_CACHE_SUFFIX_KEYS, sessionIdToUsers);
        }
        return resultMap;
    }

    @DeleteMapping("/cache")
    public void clear(@RequestParam("removeSessionIds") final String removeSessionIds,
            @RequestParam(name = "clearSessions", defaultValue = "false") final boolean clearSessions,
            @RequestParam(name = "clearConfig", defaultValue = "false") final boolean clearConfig) {
        if (clearConfig) {
            final Cache configCache = getConfigCache();
            configCache.clear();
        }
        final Cache userCache = getUserCache();
        if (clearSessions) {
            userCache.clear();
        } else if (StringUtils.isNotBlank(removeSessionIds)) {
            final ArrayList<String> sessionIdList = Lists.newArrayList(Util.COMMA_SPLITTER.split(removeSessionIds));
            for (final String sessionId : sessionIdList) {
                userCache.evict(sessionId);
            }
        }
    }

    /**
     * Only handle client() data - as other caches such as configuration we can use server side api
     * to update them
     */
    @PutMapping("/cache")
    public void addOrupdate(...) {
        if (newUserSessions == null) {
            return;
        }
        final Cache userCache = getUserCache();
        // userCache.put to add key, value
    }

    private Cache getConfigCache() {
        return cacheManager.getCache(Util.CACHE_CONFIG_DAO);
    }

    private Cache getUserCache() {
        return cacheManager.getCache(Util.CACHE_USER);
    }
}
StringRedisTemplate
@Bean("stringRedisTemplate") public StringRedisTemplate stringRedisTemplate(final JedisConnectionFactory cf, final ObjectMapper objectMapper) { final StringRedisTemplate redisTemplate = new StringRedisTemplate(); redisTemplate.setConnectionFactory(cf); return redisTemplate; }
Misc - If you want to disable caching in some environments, use NoOpCacheManager.
From lifelongprogrammer.blogspot.com
0 notes