Sending logs to Elastic

You can use the JSON decoding options of Filebeat's log input to send JSON logs to Elasticsearch. Here is an example Filebeat configuration:

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/*.json
  # Decode each line as JSON and put the parsed keys at the top level of the event
  json.keys_under_root: true
  # Add an error key to the event when JSON decoding fails
  json.add_error_key: true

output.elasticsearch:
  hosts: ["localhost:9200"]
  index: "myindex-%{+yyyy.MM.dd}"

# In Filebeat 7.x, a custom index name also needs matching template settings,
# and ILM must be disabled for the custom index to take effect:
setup.ilm.enabled: false
setup.template.name: "myindex"
setup.template.pattern: "myindex-*"


Here is an example of a docker-compose file for Filebeat, Elasticsearch, and Kibana with password authentication enabled:

version: '3.7'
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.16.2
    container_name: elasticsearch
    environment:
      - node.name=elasticsearch
      - cluster.name=elasticsearch-cluster
      - discovery.seed_hosts=elasticsearch
      - cluster.initial_master_nodes=elasticsearch
      - bootstrap.memory_lock=true
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
      # ELASTIC_PASSWORD only takes effect when X-Pack security is enabled
      - xpack.security.enabled=true
      - ELASTIC_PASSWORD=changeme
    ulimits:
      memlock:
        soft: -1
        hard: -1
    volumes:
      - esdata1:/usr/share/elasticsearch/data
    ports:
      - 9200:9200

  kibana:
    image: docker.elastic.co/kibana/kibana:7.16.2
    container_name: kibana
    environment:
      ELASTICSEARCH_URL: http://elasticsearch:9200
      ELASTICSEARCH_HOSTS: http://elasticsearch:9200
      ELASTICSEARCH_USERNAME: elastic
      ELASTICSEARCH_PASSWORD: changeme
    ports:
      - 5601:5601

  filebeat:
    image: docker.elastic.co/beats/filebeat:7.16.2
    container_name: filebeat
    user: root
    volumes:
      - ./filebeat.yml:/usr/share/filebeat/filebeat.yml
      - /var/lib/docker/containers:/var/lib/docker/containers
    command: filebeat -e --strict.perms=false

volumes:
  esdata1:
    driver: local
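
The filebeat service above mounts a filebeat.yml from the project directory. A minimal sketch of that file for this stack could look like the following; the container input, the elasticsearch:9200 host, and the elastic/changeme credentials are assumptions derived from the compose file, not part of the original post:

filebeat.inputs:
  - type: container
    paths:
      - /var/lib/docker/containers/*/*.log

output.elasticsearch:
  hosts: ["elasticsearch:9200"]
  username: "elastic"
  password: "changeme"

After starting the stack with docker-compose up -d, Kibana is reachable on port 5601 and accepts the same elastic/changeme credentials.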

Logging to the console in JSON format

To print logs in JSON format with SLF4J in Spring Boot, you can use Logback (the default logging backend) together with the logstash-logback-encoder library. Add the following dependency to your pom.xml file:


<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>6.6</version>
</dependency>
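
The dependency only puts the encoder on the classpath; Logback still has to be told to use it. A minimal logback-spring.xml sketch (placed under src/main/resources; the appender name is an arbitrary choice) that routes all console output through LogstashEncoder:

<configuration>
    <appender name="CONSOLE_JSON" class="ch.qos.logback.core.ConsoleAppender">
        <!-- LogstashEncoder serializes each logging event as one JSON line -->
        <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
    </appender>

    <root level="INFO">
        <appender-ref ref="CONSOLE_JSON"/>
    </root>
</configuration>

With this in place, every SLF4J log statement is printed as a JSON object, which the Filebeat configuration above can ship to Elasticsearch.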


Caching in Spring Cloud Gateway

To cache Next.js pages served through Spring Cloud Gateway, you can use Spring Cache with Caffeine. You can write a gateway filter that caches the response body for specific requests. Here is an example of how to do it:



@Component
@Slf4j
public class CacheResponseGatewayFilterFactory extends AbstractGatewayFilterFactory<CacheResponseGatewayFilterFactory.Config> {

    private final CacheManager cacheManager;

    public CacheResponseGatewayFilterFactory(CacheManager cacheManager) {
        super(Config.class);
        this.cacheManager = cacheManager;
    }

    @Override
    public GatewayFilter apply(Config config) {
        return (exchange, chain) -> {
            String cacheKey = exchange.getRequest().getURI().toString();
            Cache cache = cacheManager.getCache("myCache");
            byte[] cachedBody = cache.get(cacheKey, byte[].class);

            // Cache hit: write the stored body directly and skip the downstream call.
            if (cachedBody != null) {
                log.info("Returning cached response for {}", cacheKey);
                ServerHttpResponse response = exchange.getResponse();
                return response.writeWith(Mono.just(response.bufferFactory().wrap(cachedBody)));
            }

            // Cache miss: decorate the response so the body can be captured while it is written back.
            ServerHttpResponse originalResponse = exchange.getResponse();
            ServerHttpResponseDecorator decoratedResponse = new ServerHttpResponseDecorator(originalResponse) {
                @Override
                public Mono<Void> writeWith(Publisher<? extends DataBuffer> body) {
                    // Join the body into a single buffer; fine for page-sized responses, not for large streams.
                    return DataBufferUtils.join(body).flatMap(dataBuffer -> {
                        byte[] bytes = new byte[dataBuffer.readableByteCount()];
                        dataBuffer.read(bytes);
                        DataBufferUtils.release(dataBuffer);
                        if (getStatusCode() != null && getStatusCode().is2xxSuccessful()) {
                            log.info("Caching response for {}", cacheKey);
                            cache.put(cacheKey, bytes);
                        }
                        return super.writeWith(Mono.just(bufferFactory().wrap(bytes)));
                    });
                }
            };
            return chain.filter(exchange.mutate().response(decoratedResponse).build());
        };
    }

    public static class Config {
        // Put the configuration properties here (e.g. cache name, TTL)
    }

}
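
The filter above assumes a CacheManager bean exposing a cache named "myCache". A minimal sketch of a Caffeine-backed configuration (the size and TTL values are illustrative assumptions; it also requires the spring-boot-starter-cache and caffeine dependencies):

import java.time.Duration;

import org.springframework.cache.CacheManager;
import org.springframework.cache.caffeine.CaffeineCacheManager;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

import com.github.benmanes.caffeine.cache.Caffeine;

@Configuration
public class GatewayCacheConfig {

    @Bean
    public CacheManager cacheManager() {
        // The cache name must match the one looked up by the gateway filter above.
        CaffeineCacheManager cacheManager = new CaffeineCacheManager("myCache");
        cacheManager.setCaffeine(Caffeine.newBuilder()
                .maximumSize(1_000)
                .expireAfterWrite(Duration.ofMinutes(10)));
        return cacheManager;
    }
}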